00:00:00.001 Started by upstream project "autotest-per-patch" build number 126145 00:00:00.001 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.006 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.779 The recommended git tool is: git 00:00:00.779 using credential 00000000-0000-0000-0000-000000000002 00:00:00.781 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.792 Fetching changes from the remote Git repository 00:00:00.794 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.806 Using shallow fetch with depth 1 00:00:00.806 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.806 > git --version # timeout=10 00:00:00.819 > git --version # 'git version 2.39.2' 00:00:00.819 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.830 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.830 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:06.503 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:06.512 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:06.522 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD) 00:00:06.522 > git config core.sparsecheckout # timeout=10 00:00:06.531 > git read-tree -mu HEAD # timeout=10 00:00:06.547 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5 00:00:06.565 Commit message: "inventory: add WCP3 to free inventory" 00:00:06.565 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10 00:00:06.648 [Pipeline] Start of Pipeline 00:00:06.667 [Pipeline] library 00:00:06.669 Loading library shm_lib@master 00:00:06.669 Library shm_lib@master is cached. Copying from home. 00:00:06.691 [Pipeline] node 00:00:06.702 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest 00:00:06.704 [Pipeline] { 00:00:06.715 [Pipeline] catchError 00:00:06.716 [Pipeline] { 00:00:06.730 [Pipeline] wrap 00:00:06.742 [Pipeline] { 00:00:06.749 [Pipeline] stage 00:00:06.751 [Pipeline] { (Prologue) 00:00:06.920 [Pipeline] sh 00:00:07.202 + logger -p user.info -t JENKINS-CI 00:00:07.217 [Pipeline] echo 00:00:07.218 Node: WFP50 00:00:07.233 [Pipeline] sh 00:00:07.528 [Pipeline] setCustomBuildProperty 00:00:07.542 [Pipeline] echo 00:00:07.543 Cleanup processes 00:00:07.549 [Pipeline] sh 00:00:07.831 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:07.831 3264728 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:07.845 [Pipeline] sh 00:00:08.131 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:08.131 ++ grep -v 'sudo pgrep' 00:00:08.131 ++ awk '{print $1}' 00:00:08.131 + sudo kill -9 00:00:08.131 + true 00:00:08.148 [Pipeline] cleanWs 00:00:08.156 [WS-CLEANUP] Deleting project workspace... 00:00:08.156 [WS-CLEANUP] Deferred wipeout is used... 
00:00:08.162 [WS-CLEANUP] done 00:00:08.166 [Pipeline] setCustomBuildProperty 00:00:08.178 [Pipeline] sh 00:00:08.459 + sudo git config --global --replace-all safe.directory '*' 00:00:08.535 [Pipeline] httpRequest 00:00:08.568 [Pipeline] echo 00:00:08.569 Sorcerer 10.211.164.101 is alive 00:00:08.576 [Pipeline] httpRequest 00:00:08.579 HttpMethod: GET 00:00:08.579 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:08.580 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:08.600 Response Code: HTTP/1.1 200 OK 00:00:08.601 Success: Status code 200 is in the accepted range: 200,404 00:00:08.601 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:34.216 [Pipeline] sh 00:00:34.499 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:34.514 [Pipeline] httpRequest 00:00:34.531 [Pipeline] echo 00:00:34.533 Sorcerer 10.211.164.101 is alive 00:00:34.541 [Pipeline] httpRequest 00:00:34.545 HttpMethod: GET 00:00:34.545 URL: http://10.211.164.101/packages/spdk_9b8dc23b2eb0468652102fbabf58d7b3c30721e2.tar.gz 00:00:34.546 Sending request to url: http://10.211.164.101/packages/spdk_9b8dc23b2eb0468652102fbabf58d7b3c30721e2.tar.gz 00:00:34.552 Response Code: HTTP/1.1 200 OK 00:00:34.552 Success: Status code 200 is in the accepted range: 200,404 00:00:34.553 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_9b8dc23b2eb0468652102fbabf58d7b3c30721e2.tar.gz 00:02:09.785 [Pipeline] sh 00:02:10.062 + tar --no-same-owner -xf spdk_9b8dc23b2eb0468652102fbabf58d7b3c30721e2.tar.gz 00:02:14.289 [Pipeline] sh 00:02:14.568 + git -C spdk log --oneline -n5 00:02:14.568 9b8dc23b2 accel: introduce tasks in sequence limit 00:02:14.568 719d03c6a sock/uring: only register net impl if supported 00:02:14.568 e64f085ad vbdev_lvol_ut: unify usage of dummy base bdev 00:02:14.568 9937c0160 lib/rdma: bind TRACE_BDEV_IO_START/DONE to OBJECT_NVMF_RDMA_IO 00:02:14.568 6c7c1f57e accel: add sequence outstanding stat 00:02:14.583 [Pipeline] } 00:02:14.603 [Pipeline] // stage 00:02:14.614 [Pipeline] stage 00:02:14.616 [Pipeline] { (Prepare) 00:02:14.636 [Pipeline] writeFile 00:02:14.655 [Pipeline] sh 00:02:14.934 + logger -p user.info -t JENKINS-CI 00:02:14.952 [Pipeline] sh 00:02:15.237 + logger -p user.info -t JENKINS-CI 00:02:15.250 [Pipeline] sh 00:02:15.528 + cat autorun-spdk.conf 00:02:15.528 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:15.528 SPDK_TEST_BLOCKDEV=1 00:02:15.528 SPDK_TEST_ISAL=1 00:02:15.528 SPDK_TEST_CRYPTO=1 00:02:15.528 SPDK_TEST_REDUCE=1 00:02:15.528 SPDK_TEST_VBDEV_COMPRESS=1 00:02:15.528 SPDK_RUN_UBSAN=1 00:02:15.534 RUN_NIGHTLY=0 00:02:15.539 [Pipeline] readFile 00:02:15.565 [Pipeline] withEnv 00:02:15.567 [Pipeline] { 00:02:15.579 [Pipeline] sh 00:02:15.858 + set -ex 00:02:15.858 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:02:15.858 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:02:15.858 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:15.858 ++ SPDK_TEST_BLOCKDEV=1 00:02:15.858 ++ SPDK_TEST_ISAL=1 00:02:15.858 ++ SPDK_TEST_CRYPTO=1 00:02:15.859 ++ SPDK_TEST_REDUCE=1 00:02:15.859 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:02:15.859 ++ SPDK_RUN_UBSAN=1 00:02:15.859 ++ RUN_NIGHTLY=0 00:02:15.859 + case $SPDK_TEST_NVMF_NICS in 00:02:15.859 + DRIVERS= 00:02:15.859 + [[ -n '' ]] 00:02:15.859 + exit 0 00:02:15.868 [Pipeline] } 00:02:15.889 [Pipeline] // withEnv 
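Note: the xtrace above shows the job writing autorun-spdk.conf and then sourcing it before checking SPDK_TEST_NVMF_NICS. A minimal sketch of reproducing that step outside Jenkins, assuming the same workspace path and the conf contents captured in this log (the early '+ exit 0' is mirrored because no NVMf NICs are requested in this run):

#!/usr/bin/env bash
# Sketch only; run from a local checkout, not the Jenkins node (assumption).
set -ex

WORKSPACE=/var/jenkins/workspace/crypto-phy-autotest   # path as seen in the log

# Recreate the test configuration the pipeline wrote above.
cat > "$WORKSPACE/autorun-spdk.conf" <<'EOF'
SPDK_RUN_FUNCTIONAL_TEST=1
SPDK_TEST_BLOCKDEV=1
SPDK_TEST_ISAL=1
SPDK_TEST_CRYPTO=1
SPDK_TEST_REDUCE=1
SPDK_TEST_VBDEV_COMPRESS=1
SPDK_RUN_UBSAN=1
RUN_NIGHTLY=0
EOF

# Source it the same way the job does; with SPDK_TEST_NVMF_NICS unset the
# DRIVERS list stays empty and the NIC-prep step exits early ('+ exit 0').
source "$WORKSPACE/autorun-spdk.conf"
DRIVERS=
[[ -n "$DRIVERS" ]] || exit 0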
00:02:15.895 [Pipeline] } 00:02:15.914 [Pipeline] // stage 00:02:15.924 [Pipeline] catchError 00:02:15.926 [Pipeline] { 00:02:15.942 [Pipeline] timeout 00:02:15.943 Timeout set to expire in 40 min 00:02:15.944 [Pipeline] { 00:02:15.961 [Pipeline] stage 00:02:15.964 [Pipeline] { (Tests) 00:02:15.980 [Pipeline] sh 00:02:16.260 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:02:16.260 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:02:16.260 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:02:16.260 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:02:16.260 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:16.260 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:02:16.260 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:02:16.260 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:02:16.260 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:02:16.260 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:02:16.260 + [[ crypto-phy-autotest == pkgdep-* ]] 00:02:16.260 + cd /var/jenkins/workspace/crypto-phy-autotest 00:02:16.260 + source /etc/os-release 00:02:16.260 ++ NAME='Fedora Linux' 00:02:16.260 ++ VERSION='38 (Cloud Edition)' 00:02:16.260 ++ ID=fedora 00:02:16.260 ++ VERSION_ID=38 00:02:16.260 ++ VERSION_CODENAME= 00:02:16.260 ++ PLATFORM_ID=platform:f38 00:02:16.260 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:02:16.260 ++ ANSI_COLOR='0;38;2;60;110;180' 00:02:16.260 ++ LOGO=fedora-logo-icon 00:02:16.260 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:02:16.260 ++ HOME_URL=https://fedoraproject.org/ 00:02:16.260 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:02:16.260 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:02:16.260 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:02:16.260 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:02:16.260 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:02:16.260 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:02:16.260 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:02:16.260 ++ SUPPORT_END=2024-05-14 00:02:16.260 ++ VARIANT='Cloud Edition' 00:02:16.260 ++ VARIANT_ID=cloud 00:02:16.260 + uname -a 00:02:16.260 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:02:16.260 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:02:19.541 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:02:19.542 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:02:19.542 Hugepages 00:02:19.542 node hugesize free / total 00:02:19.542 node0 1048576kB 0 / 0 00:02:19.542 node0 2048kB 0 / 0 00:02:19.542 node1 1048576kB 0 / 0 00:02:19.542 node1 2048kB 0 / 0 00:02:19.542 00:02:19.542 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:19.542 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:02:19.542 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:02:19.542 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:02:19.542 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:02:19.542 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:02:19.542 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:02:19.542 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:02:19.542 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:02:19.542 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:02:19.542 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:02:19.542 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 
00:02:19.542 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:02:19.542 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:02:19.799 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:02:19.799 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:02:19.799 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:02:19.799 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:02:19.799 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:02:19.799 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:02:19.799 + rm -f /tmp/spdk-ld-path 00:02:19.799 + source autorun-spdk.conf 00:02:19.799 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:19.799 ++ SPDK_TEST_BLOCKDEV=1 00:02:19.799 ++ SPDK_TEST_ISAL=1 00:02:19.799 ++ SPDK_TEST_CRYPTO=1 00:02:19.799 ++ SPDK_TEST_REDUCE=1 00:02:19.799 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:02:19.799 ++ SPDK_RUN_UBSAN=1 00:02:19.799 ++ RUN_NIGHTLY=0 00:02:19.799 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:02:19.799 + [[ -n '' ]] 00:02:19.799 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:19.799 + for M in /var/spdk/build-*-manifest.txt 00:02:19.799 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:02:19.799 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:02:19.799 + for M in /var/spdk/build-*-manifest.txt 00:02:19.799 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:02:19.799 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:02:19.799 ++ uname 00:02:19.799 + [[ Linux == \L\i\n\u\x ]] 00:02:19.799 + sudo dmesg -T 00:02:19.799 + sudo dmesg --clear 00:02:19.799 + dmesg_pid=3266208 00:02:19.799 + [[ Fedora Linux == FreeBSD ]] 00:02:19.799 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:19.799 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:19.799 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:02:19.799 + sudo dmesg -Tw 00:02:19.799 + [[ -x /usr/src/fio-static/fio ]] 00:02:19.799 + export FIO_BIN=/usr/src/fio-static/fio 00:02:19.799 + FIO_BIN=/usr/src/fio-static/fio 00:02:19.799 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:02:19.799 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:02:19.799 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:02:19.799 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:19.799 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:19.799 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:02:19.799 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:19.799 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:19.799 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:02:19.799 Test configuration: 00:02:19.799 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:19.799 SPDK_TEST_BLOCKDEV=1 00:02:19.799 SPDK_TEST_ISAL=1 00:02:19.799 SPDK_TEST_CRYPTO=1 00:02:19.799 SPDK_TEST_REDUCE=1 00:02:19.799 SPDK_TEST_VBDEV_COMPRESS=1 00:02:19.799 SPDK_RUN_UBSAN=1 00:02:20.057 RUN_NIGHTLY=0 22:09:30 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:20.057 22:09:30 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:02:20.057 22:09:30 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:20.057 22:09:30 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:20.057 22:09:30 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:20.057 22:09:30 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:20.057 22:09:30 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:20.057 22:09:30 -- paths/export.sh@5 -- $ export PATH 00:02:20.057 22:09:30 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:20.057 22:09:30 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:20.057 22:09:30 -- common/autobuild_common.sh@444 -- $ date +%s 00:02:20.057 22:09:30 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1720814970.XXXXXX 00:02:20.057 22:09:30 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1720814970.arj5R8 00:02:20.057 22:09:30 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:02:20.057 22:09:30 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 
00:02:20.057 22:09:30 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:02:20.057 22:09:30 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:02:20.057 22:09:30 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:02:20.057 22:09:30 -- common/autobuild_common.sh@460 -- $ get_config_params 00:02:20.057 22:09:30 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:02:20.057 22:09:30 -- common/autotest_common.sh@10 -- $ set +x 00:02:20.057 22:09:30 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:02:20.057 22:09:30 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:02:20.057 22:09:30 -- pm/common@17 -- $ local monitor 00:02:20.057 22:09:30 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:20.057 22:09:30 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:20.057 22:09:30 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:20.057 22:09:30 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:20.057 22:09:30 -- pm/common@25 -- $ sleep 1 00:02:20.057 22:09:30 -- pm/common@21 -- $ date +%s 00:02:20.057 22:09:30 -- pm/common@21 -- $ date +%s 00:02:20.057 22:09:30 -- pm/common@21 -- $ date +%s 00:02:20.057 22:09:30 -- pm/common@21 -- $ date +%s 00:02:20.057 22:09:30 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720814970 00:02:20.057 22:09:30 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720814970 00:02:20.057 22:09:30 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720814970 00:02:20.057 22:09:30 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720814970 00:02:20.057 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720814970_collect-vmstat.pm.log 00:02:20.057 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720814970_collect-cpu-load.pm.log 00:02:20.057 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720814970_collect-cpu-temp.pm.log 00:02:20.057 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720814970_collect-bmc-pm.bmc.pm.log 00:02:20.990 22:09:31 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:02:20.990 22:09:31 -- spdk/autobuild.sh@11 -- $ 
SPDK_TEST_AUTOBUILD= 00:02:20.990 22:09:31 -- spdk/autobuild.sh@12 -- $ umask 022 00:02:20.990 22:09:31 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:20.990 22:09:31 -- spdk/autobuild.sh@16 -- $ date -u 00:02:20.990 Fri Jul 12 08:09:31 PM UTC 2024 00:02:20.990 22:09:31 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:02:20.990 v24.09-pre-203-g9b8dc23b2 00:02:20.990 22:09:31 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:02:20.990 22:09:31 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:02:20.990 22:09:31 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:02:20.990 22:09:31 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:02:20.990 22:09:31 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:20.990 22:09:31 -- common/autotest_common.sh@10 -- $ set +x 00:02:21.247 ************************************ 00:02:21.247 START TEST ubsan 00:02:21.247 ************************************ 00:02:21.247 22:09:31 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan' 00:02:21.247 using ubsan 00:02:21.247 00:02:21.247 real 0m0.000s 00:02:21.247 user 0m0.000s 00:02:21.247 sys 0m0.000s 00:02:21.247 22:09:31 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:02:21.247 22:09:31 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:02:21.247 ************************************ 00:02:21.247 END TEST ubsan 00:02:21.247 ************************************ 00:02:21.247 22:09:31 -- common/autotest_common.sh@1142 -- $ return 0 00:02:21.247 22:09:31 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:02:21.247 22:09:31 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:21.247 22:09:31 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:21.247 22:09:31 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:02:21.247 22:09:31 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:21.247 22:09:31 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:21.247 22:09:31 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:21.247 22:09:31 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:02:21.247 22:09:31 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:02:21.247 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:02:21.247 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:21.812 Using 'verbs' RDMA provider 00:02:38.048 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:02:52.952 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:02:52.952 Creating mk/config.mk...done. 00:02:52.952 Creating mk/cc.flags.mk...done. 00:02:52.952 Type 'make' to build. 
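Note: the configure invocation above (spdk/autobuild.sh@67) is what produced mk/config.mk and mk/cc.flags.mk before the 'make' step that follows. A minimal sketch of running the same build by hand, with the flags copied from this log; the -j72 job count is what this runner used and should be adapted to the local machine (assumption):

cd /var/jenkins/workspace/crypto-phy-autotest/spdk    # checkout path as seen in the log

./configure --enable-debug --enable-werror --with-rdma --with-idxd \
    --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests \
    --with-vbdev-compress --with-dpdk-compressdev --with-crypto \
    --enable-ubsan --enable-coverage --with-ublk --with-shared

make -j72   # job count taken from the run_test invocation below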
00:02:52.952 22:10:01 -- spdk/autobuild.sh@69 -- $ run_test make make -j72 00:02:52.952 22:10:01 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:02:52.952 22:10:01 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:52.952 22:10:01 -- common/autotest_common.sh@10 -- $ set +x 00:02:52.952 ************************************ 00:02:52.952 START TEST make 00:02:52.952 ************************************ 00:02:52.952 22:10:01 make -- common/autotest_common.sh@1123 -- $ make -j72 00:02:52.952 make[1]: Nothing to be done for 'all'. 00:03:31.691 The Meson build system 00:03:31.691 Version: 1.3.1 00:03:31.691 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:03:31.691 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:03:31.691 Build type: native build 00:03:31.691 Program cat found: YES (/usr/bin/cat) 00:03:31.691 Project name: DPDK 00:03:31.691 Project version: 24.03.0 00:03:31.691 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:03:31.691 C linker for the host machine: cc ld.bfd 2.39-16 00:03:31.691 Host machine cpu family: x86_64 00:03:31.691 Host machine cpu: x86_64 00:03:31.691 Message: ## Building in Developer Mode ## 00:03:31.691 Program pkg-config found: YES (/usr/bin/pkg-config) 00:03:31.691 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:03:31.691 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:03:31.691 Program python3 found: YES (/usr/bin/python3) 00:03:31.691 Program cat found: YES (/usr/bin/cat) 00:03:31.691 Compiler for C supports arguments -march=native: YES 00:03:31.691 Checking for size of "void *" : 8 00:03:31.691 Checking for size of "void *" : 8 (cached) 00:03:31.691 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:03:31.691 Library m found: YES 00:03:31.691 Library numa found: YES 00:03:31.691 Has header "numaif.h" : YES 00:03:31.691 Library fdt found: NO 00:03:31.691 Library execinfo found: NO 00:03:31.691 Has header "execinfo.h" : YES 00:03:31.691 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:03:31.691 Run-time dependency libarchive found: NO (tried pkgconfig) 00:03:31.691 Run-time dependency libbsd found: NO (tried pkgconfig) 00:03:31.691 Run-time dependency jansson found: NO (tried pkgconfig) 00:03:31.691 Run-time dependency openssl found: YES 3.0.9 00:03:31.691 Run-time dependency libpcap found: YES 1.10.4 00:03:31.691 Has header "pcap.h" with dependency libpcap: YES 00:03:31.691 Compiler for C supports arguments -Wcast-qual: YES 00:03:31.691 Compiler for C supports arguments -Wdeprecated: YES 00:03:31.691 Compiler for C supports arguments -Wformat: YES 00:03:31.691 Compiler for C supports arguments -Wformat-nonliteral: NO 00:03:31.691 Compiler for C supports arguments -Wformat-security: NO 00:03:31.691 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:31.691 Compiler for C supports arguments -Wmissing-prototypes: YES 00:03:31.691 Compiler for C supports arguments -Wnested-externs: YES 00:03:31.691 Compiler for C supports arguments -Wold-style-definition: YES 00:03:31.691 Compiler for C supports arguments -Wpointer-arith: YES 00:03:31.691 Compiler for C supports arguments -Wsign-compare: YES 00:03:31.691 Compiler for C supports arguments -Wstrict-prototypes: YES 00:03:31.691 Compiler for C supports arguments -Wundef: YES 00:03:31.691 Compiler for C 
supports arguments -Wwrite-strings: YES 00:03:31.691 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:03:31.691 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:03:31.691 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:31.691 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:03:31.691 Program objdump found: YES (/usr/bin/objdump) 00:03:31.691 Compiler for C supports arguments -mavx512f: YES 00:03:31.691 Checking if "AVX512 checking" compiles: YES 00:03:31.691 Fetching value of define "__SSE4_2__" : 1 00:03:31.691 Fetching value of define "__AES__" : 1 00:03:31.691 Fetching value of define "__AVX__" : 1 00:03:31.691 Fetching value of define "__AVX2__" : 1 00:03:31.691 Fetching value of define "__AVX512BW__" : 1 00:03:31.691 Fetching value of define "__AVX512CD__" : 1 00:03:31.691 Fetching value of define "__AVX512DQ__" : 1 00:03:31.691 Fetching value of define "__AVX512F__" : 1 00:03:31.691 Fetching value of define "__AVX512VL__" : 1 00:03:31.691 Fetching value of define "__PCLMUL__" : 1 00:03:31.691 Fetching value of define "__RDRND__" : 1 00:03:31.691 Fetching value of define "__RDSEED__" : 1 00:03:31.691 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:03:31.691 Fetching value of define "__znver1__" : (undefined) 00:03:31.691 Fetching value of define "__znver2__" : (undefined) 00:03:31.691 Fetching value of define "__znver3__" : (undefined) 00:03:31.691 Fetching value of define "__znver4__" : (undefined) 00:03:31.691 Compiler for C supports arguments -Wno-format-truncation: YES 00:03:31.691 Message: lib/log: Defining dependency "log" 00:03:31.691 Message: lib/kvargs: Defining dependency "kvargs" 00:03:31.691 Message: lib/telemetry: Defining dependency "telemetry" 00:03:31.691 Checking for function "getentropy" : NO 00:03:31.691 Message: lib/eal: Defining dependency "eal" 00:03:31.691 Message: lib/ring: Defining dependency "ring" 00:03:31.691 Message: lib/rcu: Defining dependency "rcu" 00:03:31.691 Message: lib/mempool: Defining dependency "mempool" 00:03:31.691 Message: lib/mbuf: Defining dependency "mbuf" 00:03:31.691 Fetching value of define "__PCLMUL__" : 1 (cached) 00:03:31.691 Fetching value of define "__AVX512F__" : 1 (cached) 00:03:31.691 Fetching value of define "__AVX512BW__" : 1 (cached) 00:03:31.691 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:03:31.691 Fetching value of define "__AVX512VL__" : 1 (cached) 00:03:31.691 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:03:31.691 Compiler for C supports arguments -mpclmul: YES 00:03:31.691 Compiler for C supports arguments -maes: YES 00:03:31.691 Compiler for C supports arguments -mavx512f: YES (cached) 00:03:31.691 Compiler for C supports arguments -mavx512bw: YES 00:03:31.691 Compiler for C supports arguments -mavx512dq: YES 00:03:31.691 Compiler for C supports arguments -mavx512vl: YES 00:03:31.691 Compiler for C supports arguments -mvpclmulqdq: YES 00:03:31.691 Compiler for C supports arguments -mavx2: YES 00:03:31.691 Compiler for C supports arguments -mavx: YES 00:03:31.691 Message: lib/net: Defining dependency "net" 00:03:31.691 Message: lib/meter: Defining dependency "meter" 00:03:31.691 Message: lib/ethdev: Defining dependency "ethdev" 00:03:31.691 Message: lib/pci: Defining dependency "pci" 00:03:31.691 Message: lib/cmdline: Defining dependency "cmdline" 00:03:31.691 Message: lib/hash: Defining dependency "hash" 00:03:31.691 Message: lib/timer: Defining dependency "timer" 00:03:31.691 Message: 
lib/compressdev: Defining dependency "compressdev" 00:03:31.691 Message: lib/cryptodev: Defining dependency "cryptodev" 00:03:31.691 Message: lib/dmadev: Defining dependency "dmadev" 00:03:31.691 Compiler for C supports arguments -Wno-cast-qual: YES 00:03:31.691 Message: lib/power: Defining dependency "power" 00:03:31.692 Message: lib/reorder: Defining dependency "reorder" 00:03:31.692 Message: lib/security: Defining dependency "security" 00:03:31.692 Has header "linux/userfaultfd.h" : YES 00:03:31.692 Has header "linux/vduse.h" : YES 00:03:31.692 Message: lib/vhost: Defining dependency "vhost" 00:03:31.692 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:03:31.692 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:03:31.692 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:03:31.692 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:03:31.692 Compiler for C supports arguments -std=c11: YES 00:03:31.692 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:03:31.692 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:03:31.692 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:03:31.692 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:03:31.692 Run-time dependency libmlx5 found: YES 1.24.44.0 00:03:31.692 Run-time dependency libibverbs found: YES 1.14.44.0 00:03:31.692 Library mtcr_ul found: NO 00:03:31.692 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:03:31.692 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:03:31.692 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:03:31.692 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:03:31.692 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:03:31.692 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:03:31.692 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:03:31.692 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:03:31.692 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:03:31.692 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:03:31.692 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:03:31.692 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:03:31.692 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:03:31.692 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:03:31.692 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies 
libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:03:34.226 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:03:34.226 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:03:34.226 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header 
"infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:03:34.227 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:03:34.227 Configuring mlx5_autoconf.h using configuration 00:03:34.227 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:03:34.227 Run-time dependency libcrypto found: YES 3.0.9 00:03:34.227 Library IPSec_MB found: YES 00:03:34.227 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:03:34.227 Message: drivers/common/qat: Defining dependency "common_qat" 00:03:34.227 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:03:34.227 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:03:34.227 Library IPSec_MB found: YES 00:03:34.227 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:03:34.227 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:03:34.227 Compiler for C supports arguments -std=c11: YES (cached) 00:03:34.227 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:03:34.227 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:03:34.227 
Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:03:34.227 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:03:34.227 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:03:34.227 Run-time dependency libisal found: NO (tried pkgconfig) 00:03:34.227 Library libisal found: NO 00:03:34.227 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:03:34.227 Compiler for C supports arguments -std=c11: YES (cached) 00:03:34.227 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:03:34.227 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:03:34.227 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:03:34.227 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:03:34.227 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:03:34.227 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:03:34.227 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:03:34.227 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:03:34.227 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:03:34.227 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:03:34.227 Program doxygen found: YES (/usr/bin/doxygen) 00:03:34.227 Configuring doxy-api-html.conf using configuration 00:03:34.227 Configuring doxy-api-man.conf using configuration 00:03:34.227 Program mandb found: YES (/usr/bin/mandb) 00:03:34.227 Program sphinx-build found: NO 00:03:34.227 Configuring rte_build_config.h using configuration 00:03:34.227 Message: 00:03:34.227 ================= 00:03:34.227 Applications Enabled 00:03:34.227 ================= 00:03:34.227 00:03:34.227 apps: 00:03:34.227 00:03:34.227 00:03:34.227 Message: 00:03:34.227 ================= 00:03:34.227 Libraries Enabled 00:03:34.227 ================= 00:03:34.227 00:03:34.227 libs: 00:03:34.227 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:03:34.227 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:03:34.227 cryptodev, dmadev, power, reorder, security, vhost, 00:03:34.227 00:03:34.227 Message: 00:03:34.227 =============== 00:03:34.227 Drivers Enabled 00:03:34.227 =============== 00:03:34.227 00:03:34.227 common: 00:03:34.227 mlx5, qat, 00:03:34.227 bus: 00:03:34.227 auxiliary, pci, vdev, 00:03:34.227 mempool: 00:03:34.227 ring, 00:03:34.227 dma: 00:03:34.227 00:03:34.227 net: 00:03:34.227 00:03:34.227 crypto: 00:03:34.227 ipsec_mb, mlx5, 00:03:34.227 compress: 00:03:34.227 isal, mlx5, 00:03:34.227 vdpa: 00:03:34.227 00:03:34.227 00:03:34.227 Message: 00:03:34.227 ================= 00:03:34.227 Content Skipped 00:03:34.227 ================= 00:03:34.227 00:03:34.227 apps: 00:03:34.227 dumpcap: explicitly disabled via build config 00:03:34.227 graph: explicitly disabled via build config 00:03:34.227 pdump: explicitly disabled via build config 00:03:34.227 proc-info: explicitly disabled via build config 00:03:34.227 test-acl: explicitly disabled via build config 00:03:34.227 test-bbdev: explicitly disabled via build config 00:03:34.227 test-cmdline: explicitly disabled via build config 00:03:34.227 test-compress-perf: explicitly disabled via build config 00:03:34.227 test-crypto-perf: explicitly disabled via build config 00:03:34.227 test-dma-perf: explicitly disabled via build config 00:03:34.227 test-eventdev: explicitly disabled via build config 00:03:34.227 test-fib: explicitly disabled via 
build config 00:03:34.227 test-flow-perf: explicitly disabled via build config 00:03:34.227 test-gpudev: explicitly disabled via build config 00:03:34.227 test-mldev: explicitly disabled via build config 00:03:34.227 test-pipeline: explicitly disabled via build config 00:03:34.227 test-pmd: explicitly disabled via build config 00:03:34.227 test-regex: explicitly disabled via build config 00:03:34.227 test-sad: explicitly disabled via build config 00:03:34.227 test-security-perf: explicitly disabled via build config 00:03:34.227 00:03:34.227 libs: 00:03:34.227 argparse: explicitly disabled via build config 00:03:34.227 metrics: explicitly disabled via build config 00:03:34.227 acl: explicitly disabled via build config 00:03:34.227 bbdev: explicitly disabled via build config 00:03:34.227 bitratestats: explicitly disabled via build config 00:03:34.227 bpf: explicitly disabled via build config 00:03:34.227 cfgfile: explicitly disabled via build config 00:03:34.227 distributor: explicitly disabled via build config 00:03:34.227 efd: explicitly disabled via build config 00:03:34.227 eventdev: explicitly disabled via build config 00:03:34.227 dispatcher: explicitly disabled via build config 00:03:34.227 gpudev: explicitly disabled via build config 00:03:34.227 gro: explicitly disabled via build config 00:03:34.227 gso: explicitly disabled via build config 00:03:34.227 ip_frag: explicitly disabled via build config 00:03:34.227 jobstats: explicitly disabled via build config 00:03:34.227 latencystats: explicitly disabled via build config 00:03:34.227 lpm: explicitly disabled via build config 00:03:34.227 member: explicitly disabled via build config 00:03:34.227 pcapng: explicitly disabled via build config 00:03:34.227 rawdev: explicitly disabled via build config 00:03:34.227 regexdev: explicitly disabled via build config 00:03:34.227 mldev: explicitly disabled via build config 00:03:34.227 rib: explicitly disabled via build config 00:03:34.227 sched: explicitly disabled via build config 00:03:34.227 stack: explicitly disabled via build config 00:03:34.227 ipsec: explicitly disabled via build config 00:03:34.227 pdcp: explicitly disabled via build config 00:03:34.227 fib: explicitly disabled via build config 00:03:34.227 port: explicitly disabled via build config 00:03:34.227 pdump: explicitly disabled via build config 00:03:34.227 table: explicitly disabled via build config 00:03:34.227 pipeline: explicitly disabled via build config 00:03:34.227 graph: explicitly disabled via build config 00:03:34.227 node: explicitly disabled via build config 00:03:34.227 00:03:34.227 drivers: 00:03:34.227 common/cpt: not in enabled drivers build config 00:03:34.227 common/dpaax: not in enabled drivers build config 00:03:34.227 common/iavf: not in enabled drivers build config 00:03:34.227 common/idpf: not in enabled drivers build config 00:03:34.227 common/ionic: not in enabled drivers build config 00:03:34.227 common/mvep: not in enabled drivers build config 00:03:34.227 common/octeontx: not in enabled drivers build config 00:03:34.227 bus/cdx: not in enabled drivers build config 00:03:34.227 bus/dpaa: not in enabled drivers build config 00:03:34.227 bus/fslmc: not in enabled drivers build config 00:03:34.227 bus/ifpga: not in enabled drivers build config 00:03:34.227 bus/platform: not in enabled drivers build config 00:03:34.227 bus/uacce: not in enabled drivers build config 00:03:34.227 bus/vmbus: not in enabled drivers build config 00:03:34.227 common/cnxk: not in enabled drivers build config 00:03:34.227 
common/nfp: not in enabled drivers build config 00:03:34.227 common/nitrox: not in enabled drivers build config 00:03:34.228 common/sfc_efx: not in enabled drivers build config 00:03:34.228 mempool/bucket: not in enabled drivers build config 00:03:34.228 mempool/cnxk: not in enabled drivers build config 00:03:34.228 mempool/dpaa: not in enabled drivers build config 00:03:34.228 mempool/dpaa2: not in enabled drivers build config 00:03:34.228 mempool/octeontx: not in enabled drivers build config 00:03:34.228 mempool/stack: not in enabled drivers build config 00:03:34.228 dma/cnxk: not in enabled drivers build config 00:03:34.228 dma/dpaa: not in enabled drivers build config 00:03:34.228 dma/dpaa2: not in enabled drivers build config 00:03:34.228 dma/hisilicon: not in enabled drivers build config 00:03:34.228 dma/idxd: not in enabled drivers build config 00:03:34.228 dma/ioat: not in enabled drivers build config 00:03:34.228 dma/skeleton: not in enabled drivers build config 00:03:34.228 net/af_packet: not in enabled drivers build config 00:03:34.228 net/af_xdp: not in enabled drivers build config 00:03:34.228 net/ark: not in enabled drivers build config 00:03:34.228 net/atlantic: not in enabled drivers build config 00:03:34.228 net/avp: not in enabled drivers build config 00:03:34.228 net/axgbe: not in enabled drivers build config 00:03:34.228 net/bnx2x: not in enabled drivers build config 00:03:34.228 net/bnxt: not in enabled drivers build config 00:03:34.228 net/bonding: not in enabled drivers build config 00:03:34.228 net/cnxk: not in enabled drivers build config 00:03:34.228 net/cpfl: not in enabled drivers build config 00:03:34.228 net/cxgbe: not in enabled drivers build config 00:03:34.228 net/dpaa: not in enabled drivers build config 00:03:34.228 net/dpaa2: not in enabled drivers build config 00:03:34.228 net/e1000: not in enabled drivers build config 00:03:34.228 net/ena: not in enabled drivers build config 00:03:34.228 net/enetc: not in enabled drivers build config 00:03:34.228 net/enetfec: not in enabled drivers build config 00:03:34.228 net/enic: not in enabled drivers build config 00:03:34.228 net/failsafe: not in enabled drivers build config 00:03:34.228 net/fm10k: not in enabled drivers build config 00:03:34.228 net/gve: not in enabled drivers build config 00:03:34.228 net/hinic: not in enabled drivers build config 00:03:34.228 net/hns3: not in enabled drivers build config 00:03:34.228 net/i40e: not in enabled drivers build config 00:03:34.228 net/iavf: not in enabled drivers build config 00:03:34.228 net/ice: not in enabled drivers build config 00:03:34.228 net/idpf: not in enabled drivers build config 00:03:34.228 net/igc: not in enabled drivers build config 00:03:34.228 net/ionic: not in enabled drivers build config 00:03:34.228 net/ipn3ke: not in enabled drivers build config 00:03:34.228 net/ixgbe: not in enabled drivers build config 00:03:34.228 net/mana: not in enabled drivers build config 00:03:34.228 net/memif: not in enabled drivers build config 00:03:34.228 net/mlx4: not in enabled drivers build config 00:03:34.228 net/mlx5: not in enabled drivers build config 00:03:34.228 net/mvneta: not in enabled drivers build config 00:03:34.228 net/mvpp2: not in enabled drivers build config 00:03:34.228 net/netvsc: not in enabled drivers build config 00:03:34.228 net/nfb: not in enabled drivers build config 00:03:34.228 net/nfp: not in enabled drivers build config 00:03:34.228 net/ngbe: not in enabled drivers build config 00:03:34.228 net/null: not in enabled drivers build config 
00:03:34.228 net/octeontx: not in enabled drivers build config 00:03:34.228 net/octeon_ep: not in enabled drivers build config 00:03:34.228 net/pcap: not in enabled drivers build config 00:03:34.228 net/pfe: not in enabled drivers build config 00:03:34.228 net/qede: not in enabled drivers build config 00:03:34.228 net/ring: not in enabled drivers build config 00:03:34.228 net/sfc: not in enabled drivers build config 00:03:34.228 net/softnic: not in enabled drivers build config 00:03:34.228 net/tap: not in enabled drivers build config 00:03:34.228 net/thunderx: not in enabled drivers build config 00:03:34.228 net/txgbe: not in enabled drivers build config 00:03:34.228 net/vdev_netvsc: not in enabled drivers build config 00:03:34.228 net/vhost: not in enabled drivers build config 00:03:34.228 net/virtio: not in enabled drivers build config 00:03:34.228 net/vmxnet3: not in enabled drivers build config 00:03:34.228 raw/*: missing internal dependency, "rawdev" 00:03:34.228 crypto/armv8: not in enabled drivers build config 00:03:34.228 crypto/bcmfs: not in enabled drivers build config 00:03:34.228 crypto/caam_jr: not in enabled drivers build config 00:03:34.228 crypto/ccp: not in enabled drivers build config 00:03:34.228 crypto/cnxk: not in enabled drivers build config 00:03:34.228 crypto/dpaa_sec: not in enabled drivers build config 00:03:34.228 crypto/dpaa2_sec: not in enabled drivers build config 00:03:34.228 crypto/mvsam: not in enabled drivers build config 00:03:34.228 crypto/nitrox: not in enabled drivers build config 00:03:34.228 crypto/null: not in enabled drivers build config 00:03:34.228 crypto/octeontx: not in enabled drivers build config 00:03:34.228 crypto/openssl: not in enabled drivers build config 00:03:34.228 crypto/scheduler: not in enabled drivers build config 00:03:34.228 crypto/uadk: not in enabled drivers build config 00:03:34.228 crypto/virtio: not in enabled drivers build config 00:03:34.228 compress/nitrox: not in enabled drivers build config 00:03:34.228 compress/octeontx: not in enabled drivers build config 00:03:34.228 compress/zlib: not in enabled drivers build config 00:03:34.228 regex/*: missing internal dependency, "regexdev" 00:03:34.228 ml/*: missing internal dependency, "mldev" 00:03:34.228 vdpa/ifc: not in enabled drivers build config 00:03:34.228 vdpa/mlx5: not in enabled drivers build config 00:03:34.228 vdpa/nfp: not in enabled drivers build config 00:03:34.228 vdpa/sfc: not in enabled drivers build config 00:03:34.228 event/*: missing internal dependency, "eventdev" 00:03:34.228 baseband/*: missing internal dependency, "bbdev" 00:03:34.228 gpu/*: missing internal dependency, "gpudev" 00:03:34.228 00:03:34.228 00:03:34.795 Build targets in project: 115 00:03:34.795 00:03:34.795 DPDK 24.03.0 00:03:34.795 00:03:34.795 User defined options 00:03:34.795 buildtype : debug 00:03:34.795 default_library : shared 00:03:34.795 libdir : lib 00:03:34.795 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:03:34.795 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:03:34.795 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:03:34.795 cpu_instruction_set: native 00:03:34.795 disable_apps : 
test-dma-perf,test,test-sad,test-acl,test-pmd,test-mldev,test-compress-perf,test-cmdline,test-regex,test-fib,graph,test-bbdev,dumpcap,test-gpudev,proc-info,test-pipeline,test-flow-perf,test-crypto-perf,pdump,test-eventdev,test-security-perf 00:03:34.795 disable_libs : port,lpm,ipsec,regexdev,dispatcher,argparse,bitratestats,rawdev,stack,graph,acl,bbdev,pipeline,member,sched,pcapng,mldev,eventdev,efd,metrics,latencystats,cfgfile,ip_frag,jobstats,pdump,pdcp,rib,node,fib,distributor,gso,table,bpf,gpudev,gro 00:03:34.795 enable_docs : false 00:03:34.795 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:03:34.795 enable_kmods : false 00:03:34.795 max_lcores : 128 00:03:34.795 tests : false 00:03:34.795 00:03:34.795 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:35.372 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:03:35.372 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:03:35.372 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:03:35.372 [3/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:03:35.372 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:03:35.372 [5/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:03:35.372 [6/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:03:35.372 [7/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:03:35.372 [8/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:03:35.372 [9/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:03:35.372 [10/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:03:35.372 [11/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:03:35.372 [12/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:03:35.372 [13/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:03:35.372 [14/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:03:35.372 [15/378] Linking static target lib/librte_kvargs.a 00:03:35.372 [16/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:03:35.634 [17/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:03:35.634 [18/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:03:35.634 [19/378] Linking static target lib/librte_log.a 00:03:35.893 [20/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:03:35.893 [21/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:03:35.893 [22/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:03:35.893 [23/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:03:35.893 [24/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:03:35.893 [25/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:03:35.893 [26/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:03:35.893 [27/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:03:35.893 [28/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:03:35.893 [29/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:03:35.893 [30/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:03:35.893 [31/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:03:35.893 [32/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:03:35.893 [33/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:03:35.893 [34/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:03:35.893 [35/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:03:35.893 [36/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:03:35.893 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:03:35.893 [38/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:03:35.893 [39/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:03:35.893 [40/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:03:35.893 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:03:35.893 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:03:35.893 [43/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:03:35.893 [44/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:03:36.159 [45/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:03:36.159 [46/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:03:36.159 [47/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:03:36.159 [48/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:03:36.159 [49/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:03:36.159 [50/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:03:36.159 [51/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:03:36.159 [52/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:03:36.159 [53/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:03:36.159 [54/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:03:36.159 [55/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:03:36.159 [56/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:03:36.159 [57/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:03:36.159 [58/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:03:36.159 [59/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:03:36.159 [60/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:03:36.159 [61/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:03:36.159 [62/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:03:36.159 [63/378] Linking static target lib/librte_telemetry.a 00:03:36.159 [64/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:03:36.159 [65/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:03:36.159 [66/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:03:36.159 [67/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:03:36.159 [68/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:03:36.160 [69/378] Compiling C 
object lib/librte_pci.a.p/pci_rte_pci.c.o 00:03:36.160 [70/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:03:36.160 [71/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:03:36.160 [72/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:03:36.160 [73/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:03:36.160 [74/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:03:36.160 [75/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:03:36.160 [76/378] Linking static target lib/librte_pci.a 00:03:36.160 [77/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:03:36.160 [78/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:03:36.160 [79/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:03:36.160 [80/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:03:36.160 [81/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:03:36.160 [82/378] Linking static target lib/librte_ring.a 00:03:36.160 [83/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:03:36.160 [84/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:03:36.160 [85/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:03:36.160 [86/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:03:36.160 [87/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:03:36.160 [88/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:03:36.160 [89/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:03:36.160 [90/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:03:36.160 [91/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:03:36.160 [92/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:03:36.160 [93/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:03:36.160 [94/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:03:36.160 [95/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:03:36.160 [96/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:03:36.160 [97/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:03:36.160 [98/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:03:36.160 [99/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:03:36.160 [100/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:03:36.160 [101/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:03:36.160 [102/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:03:36.421 [103/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:03:36.421 [104/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:03:36.421 [105/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:03:36.421 [106/378] Linking static target lib/librte_net.a 00:03:36.421 [107/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:03:36.421 [108/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:03:36.421 [109/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:03:36.421 [110/378] Linking static target lib/librte_mempool.a 00:03:36.421 [111/378] Linking 
static target lib/librte_rcu.a 00:03:36.421 [112/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:03:36.421 [113/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:03:36.421 [114/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:03:36.421 [115/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:03:36.421 [116/378] Linking target lib/librte_log.so.24.1 00:03:36.421 [117/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:03:36.421 [118/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:03:36.421 [119/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:03:36.421 [120/378] Linking static target lib/librte_mbuf.a 00:03:36.421 [121/378] Linking static target lib/librte_meter.a 00:03:36.421 [122/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:03:36.421 [123/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:03:36.679 [124/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:03:36.679 [125/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:03:36.679 [126/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:03:36.679 [127/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:03:36.679 [128/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:03:36.679 [129/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:03:36.679 [130/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:03:36.679 [131/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:03:36.679 [132/378] Linking target lib/librte_kvargs.so.24.1 00:03:36.679 [133/378] Linking static target lib/librte_cmdline.a 00:03:36.679 [134/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:03:36.679 [135/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:03:36.679 [136/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:03:36.679 [137/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:03:36.679 [138/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:03:36.679 [139/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:03:36.679 [140/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:03:36.679 [141/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:03:36.948 [142/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:03:36.948 [143/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:03:36.948 [144/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:03:36.948 [145/378] Linking static target lib/librte_timer.a 00:03:36.948 [146/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:03:36.948 [147/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:03:36.948 [148/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:03:36.948 [149/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:03:36.948 [150/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:03:36.948 [151/378] Compiling C object 
lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:03:36.948 [152/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:03:36.948 [153/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:03:36.948 [154/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:03:36.948 [155/378] Linking static target lib/librte_eal.a 00:03:36.948 [156/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:03:36.948 [157/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:03:36.948 [158/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:03:36.948 [159/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:03:36.948 [160/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:03:36.948 [161/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:03:36.948 [162/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:03:36.948 [163/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:03:36.948 [164/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:03:36.948 [165/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:03:36.948 [166/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:03:36.948 [167/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:03:36.948 [168/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:03:36.948 [169/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:03:36.948 [170/378] Linking target lib/librte_telemetry.so.24.1 00:03:36.948 [171/378] Linking static target lib/librte_compressdev.a 00:03:36.948 [172/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:03:36.948 [173/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:03:36.948 [174/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:03:36.948 [175/378] Linking static target lib/librte_dmadev.a 00:03:36.948 [176/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:03:36.948 [177/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:03:36.948 [178/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:03:36.948 [179/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:03:36.948 [180/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:03:36.948 [181/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:03:36.948 [182/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:03:37.224 [183/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:03:37.224 [184/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:03:37.224 [185/378] Linking static target lib/librte_reorder.a 00:03:37.224 [186/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:03:37.224 [187/378] Linking static target lib/librte_power.a 00:03:37.224 [188/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:03:37.224 [189/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:03:37.224 [190/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:03:37.224 [191/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:03:37.224 [192/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:03:37.224 [193/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:03:37.224 [194/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:03:37.224 [195/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:03:37.224 [196/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:03:37.224 [197/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:03:37.224 [198/378] Linking static target lib/librte_security.a 00:03:37.224 [199/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:03:37.224 [200/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:03:37.224 [201/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:03:37.224 [202/378] Linking static target drivers/librte_bus_auxiliary.a 00:03:37.224 [203/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:03:37.496 [204/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:03:37.496 [205/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:03:37.496 [206/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:03:37.496 [207/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:03:37.496 [208/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:03:37.496 [209/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:03:37.496 [210/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:03:37.497 [211/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:03:37.497 [212/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:03:37.497 [213/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:03:37.497 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:03:37.497 [215/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:03:37.497 [216/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:03:37.497 [217/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:03:37.497 [218/378] Linking static target drivers/librte_bus_vdev.a 00:03:37.497 [219/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:03:37.497 [220/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:03:37.497 [221/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:03:37.497 [222/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:03:37.497 [223/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:03:37.497 [224/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:03:37.497 [225/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:03:37.497 [226/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:03:37.497 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:03:37.497 [228/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:03:37.497 [229/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:03:37.497 [230/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:03:37.497 [231/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:03:37.497 [232/378] Linking static target lib/librte_hash.a 00:03:37.497 [233/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:03:37.497 [234/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:03:37.497 [235/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:03:37.497 [236/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:03:37.497 [237/378] Linking static target drivers/librte_bus_pci.a 00:03:37.497 [238/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:03:37.497 [239/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:03:37.757 [240/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:03:37.757 [241/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:03:37.757 [242/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:03:37.757 [243/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:03:37.757 [244/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:03:37.757 [245/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:03:37.757 [246/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:03:37.757 [247/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:03:37.757 [248/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:03:37.757 [249/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:03:37.757 [250/378] Linking static target lib/librte_cryptodev.a 00:03:37.757 [251/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:03:37.757 [252/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:37.757 [253/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:03:37.757 [254/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:03:37.757 [255/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:37.757 [256/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:03:37.757 [257/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:03:37.757 [258/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:03:37.757 [259/378] Linking 
static target drivers/libtmp_rte_mempool_ring.a 00:03:37.757 [260/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:03:37.757 [261/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:03:37.757 [262/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:38.016 [263/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:03:38.016 [264/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:03:38.016 [265/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:03:38.016 [266/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:03:38.016 [267/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:03:38.016 [268/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:03:38.016 [269/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:03:38.016 [270/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:03:38.016 [271/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:03:38.016 [272/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:03:38.016 [273/378] Linking static target lib/librte_ethdev.a 00:03:38.016 [274/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:03:38.016 [275/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:03:38.016 [276/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:03:38.016 [277/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:03:38.016 [278/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:03:38.016 [279/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:03:38.016 [280/378] Linking static target drivers/librte_mempool_ring.a 00:03:38.016 [281/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:03:38.016 [282/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:03:38.016 [283/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:03:38.016 [284/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:03:38.016 [285/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:03:38.275 [286/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:03:38.275 [287/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:03:38.275 [288/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:03:38.275 [289/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:03:38.275 [290/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:03:38.275 [291/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:03:38.275 [292/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:03:38.275 [293/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:03:38.275 [294/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:03:38.275 [295/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:03:38.275 [296/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:03:38.275 [297/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:03:38.275 [298/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:03:38.275 [299/378] Linking static target drivers/librte_compress_mlx5.a 00:03:38.275 [300/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:03:38.275 [301/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:03:38.533 [302/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:03:38.533 [303/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:03:38.533 [304/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:03:38.533 [305/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:03:38.533 [306/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:03:38.533 [307/378] Linking static target drivers/librte_common_mlx5.a 00:03:38.533 [308/378] Linking static target drivers/librte_crypto_mlx5.a 00:03:38.533 [309/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:03:38.534 [310/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:03:38.534 [311/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:03:38.534 [312/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:03:38.534 [313/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:03:38.534 [314/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:03:38.534 [315/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:03:38.534 [316/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:03:38.534 [317/378] Linking static target drivers/librte_compress_isal.a 00:03:39.102 [318/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:03:39.361 [319/378] Linking static target lib/librte_vhost.a 00:03:39.361 [320/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:03:39.361 [321/378] Linking static target drivers/libtmp_rte_common_qat.a 00:03:39.620 [322/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:03:39.620 [323/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:03:39.620 [324/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:03:39.878 [325/378] Linking static target drivers/librte_common_qat.a 00:03:39.878 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:41.782 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:03:44.312 [328/378] Generating drivers/rte_common_mlx5.sym_chk with 
a custom command (wrapped by meson to capture output) 00:03:47.596 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:49.591 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:03:49.591 [331/378] Linking target lib/librte_eal.so.24.1 00:03:49.591 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:03:49.591 [333/378] Linking target lib/librte_pci.so.24.1 00:03:49.591 [334/378] Linking target drivers/librte_bus_vdev.so.24.1 00:03:49.591 [335/378] Linking target lib/librte_ring.so.24.1 00:03:49.591 [336/378] Linking target lib/librte_meter.so.24.1 00:03:49.591 [337/378] Linking target lib/librte_dmadev.so.24.1 00:03:49.591 [338/378] Linking target lib/librte_timer.so.24.1 00:03:49.591 [339/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:03:49.591 [340/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:03:49.591 [341/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:03:49.591 [342/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:03:49.591 [343/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:03:49.591 [344/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:03:49.850 [345/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:03:49.850 [346/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:03:49.850 [347/378] Linking target lib/librte_rcu.so.24.1 00:03:49.850 [348/378] Linking target lib/librte_mempool.so.24.1 00:03:49.850 [349/378] Linking target drivers/librte_bus_pci.so.24.1 00:03:49.850 [350/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:03:49.850 [351/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:03:49.850 [352/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:03:50.109 [353/378] Linking target lib/librte_mbuf.so.24.1 00:03:50.109 [354/378] Linking target drivers/librte_mempool_ring.so.24.1 00:03:50.367 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:03:50.367 [356/378] Linking target lib/librte_reorder.so.24.1 00:03:50.367 [357/378] Linking target lib/librte_net.so.24.1 00:03:50.367 [358/378] Linking target lib/librte_compressdev.so.24.1 00:03:50.367 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:03:50.367 [360/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:03:50.367 [361/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:03:50.367 [362/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:03:50.626 [363/378] Linking target lib/librte_security.so.24.1 00:03:50.626 [364/378] Linking target lib/librte_hash.so.24.1 00:03:50.626 [365/378] Linking target lib/librte_cmdline.so.24.1 00:03:50.626 [366/378] Linking target drivers/librte_compress_isal.so.24.1 00:03:50.626 [367/378] Linking target lib/librte_ethdev.so.24.1 00:03:50.626 [368/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:03:50.626 [369/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:03:50.626 [370/378] Generating symbol file 
lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:03:50.884 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:03:50.884 [372/378] Linking target lib/librte_power.so.24.1 00:03:50.884 [373/378] Linking target lib/librte_vhost.so.24.1 00:03:50.884 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:03:50.884 [375/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:03:50.885 [376/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:03:51.143 [377/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:03:51.143 [378/378] Linking target drivers/librte_common_qat.so.24.1 00:03:51.144 INFO: autodetecting backend as ninja 00:03:51.144 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:03:52.527 CC lib/ut_mock/mock.o 00:03:52.527 CC lib/log/log.o 00:03:52.527 CC lib/log/log_flags.o 00:03:52.527 CC lib/log/log_deprecated.o 00:03:52.527 CC lib/ut/ut.o 00:03:52.527 LIB libspdk_ut_mock.a 00:03:52.527 LIB libspdk_log.a 00:03:52.527 LIB libspdk_ut.a 00:03:52.527 SO libspdk_ut_mock.so.6.0 00:03:52.527 SO libspdk_log.so.7.0 00:03:52.527 SO libspdk_ut.so.2.0 00:03:52.527 SYMLINK libspdk_ut_mock.so 00:03:52.527 SYMLINK libspdk_log.so 00:03:52.527 SYMLINK libspdk_ut.so 00:03:52.783 CC lib/util/base64.o 00:03:52.784 CC lib/util/bit_array.o 00:03:52.784 CC lib/util/cpuset.o 00:03:52.784 CC lib/util/crc16.o 00:03:52.784 CC lib/util/crc32c.o 00:03:52.784 CC lib/util/crc32.o 00:03:52.784 CC lib/util/crc32_ieee.o 00:03:52.784 CC lib/util/crc64.o 00:03:52.784 CC lib/util/file.o 00:03:52.784 CC lib/util/fd.o 00:03:52.784 CC lib/util/dif.o 00:03:52.784 CC lib/util/hexlify.o 00:03:52.784 CC lib/util/iov.o 00:03:52.784 CC lib/util/math.o 00:03:52.784 CC lib/util/pipe.o 00:03:52.784 CC lib/util/strerror_tls.o 00:03:52.784 CC lib/util/string.o 00:03:52.784 CC lib/util/fd_group.o 00:03:52.784 CC lib/ioat/ioat.o 00:03:52.784 CC lib/util/uuid.o 00:03:52.784 CC lib/util/xor.o 00:03:52.784 CXX lib/trace_parser/trace.o 00:03:52.784 CC lib/util/zipf.o 00:03:53.041 CC lib/dma/dma.o 00:03:53.041 CC lib/vfio_user/host/vfio_user_pci.o 00:03:53.041 CC lib/vfio_user/host/vfio_user.o 00:03:53.041 LIB libspdk_dma.a 00:03:53.298 SO libspdk_dma.so.4.0 00:03:53.298 LIB libspdk_ioat.a 00:03:53.298 SO libspdk_ioat.so.7.0 00:03:53.298 SYMLINK libspdk_dma.so 00:03:53.298 SYMLINK libspdk_ioat.so 00:03:53.298 LIB libspdk_vfio_user.a 00:03:53.555 SO libspdk_vfio_user.so.5.0 00:03:53.555 LIB libspdk_util.a 00:03:53.555 SYMLINK libspdk_vfio_user.so 00:03:53.555 SO libspdk_util.so.9.1 00:03:53.811 SYMLINK libspdk_util.so 00:03:53.811 LIB libspdk_trace_parser.a 00:03:53.811 SO libspdk_trace_parser.so.5.0 00:03:54.069 SYMLINK libspdk_trace_parser.so 00:03:54.069 CC lib/rdma_provider/common.o 00:03:54.069 CC lib/conf/conf.o 00:03:54.069 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:54.069 CC lib/rdma_utils/rdma_utils.o 00:03:54.069 CC lib/json/json_parse.o 00:03:54.069 CC lib/json/json_util.o 00:03:54.069 CC lib/env_dpdk/env.o 00:03:54.069 CC lib/json/json_write.o 00:03:54.069 CC lib/env_dpdk/memory.o 00:03:54.069 CC lib/env_dpdk/pci.o 00:03:54.069 CC lib/idxd/idxd_user.o 00:03:54.069 CC lib/idxd/idxd.o 00:03:54.069 CC lib/idxd/idxd_kernel.o 00:03:54.069 CC lib/env_dpdk/init.o 00:03:54.069 CC lib/env_dpdk/threads.o 00:03:54.069 CC lib/env_dpdk/pci_ioat.o 00:03:54.069 CC lib/env_dpdk/pci_vmd.o 00:03:54.069 CC lib/env_dpdk/pci_virtio.o 00:03:54.069 CC 
lib/env_dpdk/pci_idxd.o 00:03:54.069 CC lib/env_dpdk/pci_event.o 00:03:54.069 CC lib/env_dpdk/pci_dpdk.o 00:03:54.069 CC lib/env_dpdk/sigbus_handler.o 00:03:54.069 CC lib/vmd/vmd.o 00:03:54.069 CC lib/reduce/reduce.o 00:03:54.069 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:54.069 CC lib/vmd/led.o 00:03:54.069 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:54.327 LIB libspdk_rdma_provider.a 00:03:54.327 LIB libspdk_conf.a 00:03:54.327 SO libspdk_rdma_provider.so.6.0 00:03:54.327 LIB libspdk_rdma_utils.a 00:03:54.327 SO libspdk_conf.so.6.0 00:03:54.327 SYMLINK libspdk_rdma_provider.so 00:03:54.327 SO libspdk_rdma_utils.so.1.0 00:03:54.586 SYMLINK libspdk_conf.so 00:03:54.586 SYMLINK libspdk_rdma_utils.so 00:03:54.586 LIB libspdk_json.a 00:03:54.586 SO libspdk_json.so.6.0 00:03:54.586 LIB libspdk_idxd.a 00:03:54.844 SO libspdk_idxd.so.12.0 00:03:54.844 SYMLINK libspdk_json.so 00:03:54.844 LIB libspdk_vmd.a 00:03:54.844 LIB libspdk_reduce.a 00:03:54.844 SYMLINK libspdk_idxd.so 00:03:54.844 SO libspdk_reduce.so.6.0 00:03:54.844 SO libspdk_vmd.so.6.0 00:03:54.844 SYMLINK libspdk_reduce.so 00:03:54.844 SYMLINK libspdk_vmd.so 00:03:55.102 CC lib/jsonrpc/jsonrpc_server.o 00:03:55.102 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:55.102 CC lib/jsonrpc/jsonrpc_client.o 00:03:55.102 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:55.102 LIB libspdk_env_dpdk.a 00:03:55.361 SO libspdk_env_dpdk.so.14.1 00:03:55.361 LIB libspdk_jsonrpc.a 00:03:55.361 SYMLINK libspdk_env_dpdk.so 00:03:55.361 SO libspdk_jsonrpc.so.6.0 00:03:55.620 SYMLINK libspdk_jsonrpc.so 00:03:55.879 CC lib/rpc/rpc.o 00:03:56.138 LIB libspdk_rpc.a 00:03:56.138 SO libspdk_rpc.so.6.0 00:03:56.138 SYMLINK libspdk_rpc.so 00:03:56.396 CC lib/trace/trace_flags.o 00:03:56.396 CC lib/trace/trace.o 00:03:56.396 CC lib/trace/trace_rpc.o 00:03:56.654 CC lib/notify/notify.o 00:03:56.654 CC lib/notify/notify_rpc.o 00:03:56.654 CC lib/keyring/keyring.o 00:03:56.654 CC lib/keyring/keyring_rpc.o 00:03:56.654 LIB libspdk_notify.a 00:03:56.654 SO libspdk_notify.so.6.0 00:03:56.913 LIB libspdk_trace.a 00:03:56.913 LIB libspdk_keyring.a 00:03:56.913 SO libspdk_trace.so.10.0 00:03:56.913 SYMLINK libspdk_notify.so 00:03:56.913 SO libspdk_keyring.so.1.0 00:03:56.913 SYMLINK libspdk_trace.so 00:03:56.913 SYMLINK libspdk_keyring.so 00:03:57.171 CC lib/thread/thread.o 00:03:57.171 CC lib/thread/iobuf.o 00:03:57.429 CC lib/sock/sock.o 00:03:57.429 CC lib/sock/sock_rpc.o 00:03:57.688 LIB libspdk_sock.a 00:03:57.688 SO libspdk_sock.so.10.0 00:03:57.947 SYMLINK libspdk_sock.so 00:03:58.205 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:58.205 CC lib/nvme/nvme_ctrlr.o 00:03:58.205 CC lib/nvme/nvme_fabric.o 00:03:58.205 CC lib/nvme/nvme_ns.o 00:03:58.205 CC lib/nvme/nvme_ns_cmd.o 00:03:58.205 CC lib/nvme/nvme_pcie_common.o 00:03:58.205 CC lib/nvme/nvme_pcie.o 00:03:58.205 CC lib/nvme/nvme_quirks.o 00:03:58.205 CC lib/nvme/nvme_qpair.o 00:03:58.205 CC lib/nvme/nvme.o 00:03:58.205 CC lib/nvme/nvme_transport.o 00:03:58.205 CC lib/nvme/nvme_discovery.o 00:03:58.205 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:58.205 CC lib/nvme/nvme_tcp.o 00:03:58.205 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:58.205 CC lib/nvme/nvme_opal.o 00:03:58.205 CC lib/nvme/nvme_io_msg.o 00:03:58.205 CC lib/nvme/nvme_poll_group.o 00:03:58.205 CC lib/nvme/nvme_zns.o 00:03:58.205 CC lib/nvme/nvme_stubs.o 00:03:58.205 CC lib/nvme/nvme_auth.o 00:03:58.205 CC lib/nvme/nvme_cuse.o 00:03:58.205 CC lib/nvme/nvme_rdma.o 00:03:58.772 LIB libspdk_thread.a 00:03:58.772 SO libspdk_thread.so.10.1 00:03:59.029 SYMLINK libspdk_thread.so 00:03:59.286 CC 
lib/blob/blobstore.o 00:03:59.286 CC lib/blob/request.o 00:03:59.286 CC lib/blob/blob_bs_dev.o 00:03:59.286 CC lib/blob/zeroes.o 00:03:59.286 CC lib/accel/accel.o 00:03:59.286 CC lib/accel/accel_rpc.o 00:03:59.286 CC lib/accel/accel_sw.o 00:03:59.286 CC lib/init/json_config.o 00:03:59.286 CC lib/init/subsystem.o 00:03:59.286 CC lib/init/subsystem_rpc.o 00:03:59.286 CC lib/init/rpc.o 00:03:59.286 CC lib/virtio/virtio_vhost_user.o 00:03:59.286 CC lib/virtio/virtio.o 00:03:59.286 CC lib/virtio/virtio_pci.o 00:03:59.286 CC lib/virtio/virtio_vfio_user.o 00:03:59.545 LIB libspdk_init.a 00:03:59.545 SO libspdk_init.so.5.0 00:03:59.545 LIB libspdk_virtio.a 00:03:59.545 SYMLINK libspdk_init.so 00:03:59.545 SO libspdk_virtio.so.7.0 00:03:59.805 SYMLINK libspdk_virtio.so 00:04:00.064 CC lib/event/app.o 00:04:00.064 CC lib/event/reactor.o 00:04:00.064 CC lib/event/log_rpc.o 00:04:00.064 CC lib/event/app_rpc.o 00:04:00.064 CC lib/event/scheduler_static.o 00:04:00.323 LIB libspdk_accel.a 00:04:00.323 SO libspdk_accel.so.15.1 00:04:00.323 LIB libspdk_event.a 00:04:00.323 LIB libspdk_nvme.a 00:04:00.323 SYMLINK libspdk_accel.so 00:04:00.609 SO libspdk_event.so.14.0 00:04:00.609 SYMLINK libspdk_event.so 00:04:00.609 SO libspdk_nvme.so.13.1 00:04:00.868 CC lib/bdev/bdev.o 00:04:00.868 CC lib/bdev/bdev_rpc.o 00:04:00.868 CC lib/bdev/bdev_zone.o 00:04:00.868 CC lib/bdev/part.o 00:04:00.868 CC lib/bdev/scsi_nvme.o 00:04:00.868 SYMLINK libspdk_nvme.so 00:04:02.244 LIB libspdk_blob.a 00:04:02.503 SO libspdk_blob.so.11.0 00:04:02.503 SYMLINK libspdk_blob.so 00:04:02.760 CC lib/lvol/lvol.o 00:04:02.760 CC lib/blobfs/blobfs.o 00:04:02.760 CC lib/blobfs/tree.o 00:04:03.326 LIB libspdk_bdev.a 00:04:03.583 SO libspdk_bdev.so.15.1 00:04:03.583 SYMLINK libspdk_bdev.so 00:04:03.583 LIB libspdk_blobfs.a 00:04:03.844 SO libspdk_blobfs.so.10.0 00:04:03.844 LIB libspdk_lvol.a 00:04:03.844 SYMLINK libspdk_blobfs.so 00:04:03.844 SO libspdk_lvol.so.10.0 00:04:03.844 SYMLINK libspdk_lvol.so 00:04:03.844 CC lib/scsi/dev.o 00:04:03.844 CC lib/scsi/port.o 00:04:03.844 CC lib/scsi/lun.o 00:04:03.844 CC lib/scsi/scsi_bdev.o 00:04:03.844 CC lib/scsi/scsi.o 00:04:03.844 CC lib/scsi/scsi_pr.o 00:04:03.844 CC lib/scsi/scsi_rpc.o 00:04:03.844 CC lib/ftl/ftl_core.o 00:04:03.844 CC lib/scsi/task.o 00:04:03.844 CC lib/ftl/ftl_init.o 00:04:03.844 CC lib/ftl/ftl_layout.o 00:04:03.844 CC lib/ublk/ublk.o 00:04:03.844 CC lib/ftl/ftl_debug.o 00:04:03.844 CC lib/ublk/ublk_rpc.o 00:04:03.844 CC lib/ftl/ftl_l2p.o 00:04:03.844 CC lib/ftl/ftl_io.o 00:04:03.844 CC lib/ftl/ftl_sb.o 00:04:03.844 CC lib/nvmf/ctrlr.o 00:04:03.844 CC lib/nvmf/ctrlr_discovery.o 00:04:03.844 CC lib/ftl/ftl_l2p_flat.o 00:04:03.844 CC lib/nvmf/ctrlr_bdev.o 00:04:03.844 CC lib/ftl/ftl_nv_cache.o 00:04:03.844 CC lib/nvmf/subsystem.o 00:04:03.844 CC lib/nvmf/nvmf.o 00:04:03.844 CC lib/ftl/ftl_band.o 00:04:03.844 CC lib/nbd/nbd.o 00:04:03.844 CC lib/ftl/ftl_rq.o 00:04:03.844 CC lib/nvmf/nvmf_rpc.o 00:04:03.844 CC lib/ftl/ftl_band_ops.o 00:04:03.844 CC lib/nbd/nbd_rpc.o 00:04:03.844 CC lib/nvmf/transport.o 00:04:03.844 CC lib/ftl/ftl_writer.o 00:04:03.844 CC lib/nvmf/tcp.o 00:04:03.844 CC lib/nvmf/stubs.o 00:04:03.844 CC lib/nvmf/mdns_server.o 00:04:03.844 CC lib/ftl/ftl_reloc.o 00:04:03.844 CC lib/ftl/ftl_l2p_cache.o 00:04:03.844 CC lib/ftl/ftl_p2l.o 00:04:03.844 CC lib/nvmf/rdma.o 00:04:03.844 CC lib/nvmf/auth.o 00:04:03.844 CC lib/ftl/mngt/ftl_mngt.o 00:04:03.844 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:04:03.844 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:04:03.844 CC 
lib/ftl/mngt/ftl_mngt_startup.o 00:04:03.844 CC lib/ftl/mngt/ftl_mngt_md.o 00:04:03.844 CC lib/ftl/mngt/ftl_mngt_misc.o 00:04:03.844 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:04:03.844 CC lib/ftl/mngt/ftl_mngt_band.o 00:04:03.844 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:04:03.844 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:04:03.844 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:04:03.844 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:04:03.844 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:04:03.844 CC lib/ftl/utils/ftl_conf.o 00:04:03.844 CC lib/ftl/utils/ftl_md.o 00:04:03.844 CC lib/ftl/utils/ftl_mempool.o 00:04:03.844 CC lib/ftl/utils/ftl_bitmap.o 00:04:03.844 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:04:03.844 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:04:03.844 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:04:03.844 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:04:03.844 CC lib/ftl/utils/ftl_property.o 00:04:03.844 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:04:03.844 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:04:03.844 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:04:03.844 CC lib/ftl/upgrade/ftl_sb_v3.o 00:04:03.844 CC lib/ftl/nvc/ftl_nvc_dev.o 00:04:03.844 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:04:03.844 CC lib/ftl/upgrade/ftl_sb_v5.o 00:04:03.844 CC lib/ftl/base/ftl_base_dev.o 00:04:04.106 CC lib/ftl/ftl_trace.o 00:04:04.106 CC lib/ftl/base/ftl_base_bdev.o 00:04:04.673 LIB libspdk_nbd.a 00:04:04.674 SO libspdk_nbd.so.7.0 00:04:04.674 SYMLINK libspdk_nbd.so 00:04:04.933 LIB libspdk_ublk.a 00:04:04.933 SO libspdk_ublk.so.3.0 00:04:04.933 LIB libspdk_scsi.a 00:04:04.933 SYMLINK libspdk_ublk.so 00:04:04.933 SO libspdk_scsi.so.9.0 00:04:05.192 SYMLINK libspdk_scsi.so 00:04:05.192 LIB libspdk_ftl.a 00:04:05.451 SO libspdk_ftl.so.9.0 00:04:05.451 CC lib/iscsi/conn.o 00:04:05.451 CC lib/iscsi/init_grp.o 00:04:05.451 CC lib/iscsi/iscsi.o 00:04:05.451 CC lib/iscsi/md5.o 00:04:05.451 CC lib/iscsi/param.o 00:04:05.451 CC lib/iscsi/portal_grp.o 00:04:05.451 CC lib/iscsi/tgt_node.o 00:04:05.451 CC lib/iscsi/iscsi_subsystem.o 00:04:05.451 CC lib/iscsi/iscsi_rpc.o 00:04:05.451 CC lib/iscsi/task.o 00:04:05.451 CC lib/vhost/vhost.o 00:04:05.451 CC lib/vhost/vhost_rpc.o 00:04:05.451 CC lib/vhost/vhost_scsi.o 00:04:05.451 CC lib/vhost/vhost_blk.o 00:04:05.451 CC lib/vhost/rte_vhost_user.o 00:04:05.708 SYMLINK libspdk_ftl.so 00:04:06.275 LIB libspdk_nvmf.a 00:04:06.275 SO libspdk_nvmf.so.18.1 00:04:06.534 SYMLINK libspdk_nvmf.so 00:04:06.534 LIB libspdk_vhost.a 00:04:06.534 SO libspdk_vhost.so.8.0 00:04:06.792 SYMLINK libspdk_vhost.so 00:04:06.792 LIB libspdk_iscsi.a 00:04:07.049 SO libspdk_iscsi.so.8.0 00:04:07.049 SYMLINK libspdk_iscsi.so 00:04:07.613 CC module/env_dpdk/env_dpdk_rpc.o 00:04:07.872 CC module/keyring/file/keyring_rpc.o 00:04:07.872 CC module/keyring/file/keyring.o 00:04:07.872 CC module/keyring/linux/keyring_rpc.o 00:04:07.872 CC module/keyring/linux/keyring.o 00:04:07.872 CC module/scheduler/gscheduler/gscheduler.o 00:04:07.872 CC module/accel/error/accel_error.o 00:04:07.872 CC module/accel/error/accel_error_rpc.o 00:04:07.872 CC module/sock/posix/posix.o 00:04:07.872 CC module/scheduler/dynamic/scheduler_dynamic.o 00:04:07.872 CC module/accel/ioat/accel_ioat.o 00:04:07.872 CC module/accel/ioat/accel_ioat_rpc.o 00:04:07.872 CC module/accel/iaa/accel_iaa.o 00:04:07.872 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:04:07.872 CC module/accel/iaa/accel_iaa_rpc.o 00:04:07.872 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:04:07.872 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:04:07.872 CC 
module/accel/dsa/accel_dsa.o 00:04:07.872 CC module/accel/dsa/accel_dsa_rpc.o 00:04:07.872 CC module/blob/bdev/blob_bdev.o 00:04:07.872 LIB libspdk_env_dpdk_rpc.a 00:04:07.872 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:04:07.872 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:04:07.872 SO libspdk_env_dpdk_rpc.so.6.0 00:04:07.872 SYMLINK libspdk_env_dpdk_rpc.so 00:04:08.132 LIB libspdk_keyring_linux.a 00:04:08.132 LIB libspdk_keyring_file.a 00:04:08.132 LIB libspdk_scheduler_gscheduler.a 00:04:08.132 SO libspdk_keyring_file.so.1.0 00:04:08.132 SO libspdk_keyring_linux.so.1.0 00:04:08.132 LIB libspdk_accel_iaa.a 00:04:08.132 LIB libspdk_scheduler_dpdk_governor.a 00:04:08.132 LIB libspdk_accel_error.a 00:04:08.132 SO libspdk_scheduler_gscheduler.so.4.0 00:04:08.132 LIB libspdk_scheduler_dynamic.a 00:04:08.132 SO libspdk_scheduler_dpdk_governor.so.4.0 00:04:08.132 SO libspdk_accel_error.so.2.0 00:04:08.132 SO libspdk_accel_iaa.so.3.0 00:04:08.132 SYMLINK libspdk_keyring_file.so 00:04:08.132 SYMLINK libspdk_keyring_linux.so 00:04:08.132 SO libspdk_scheduler_dynamic.so.4.0 00:04:08.132 SYMLINK libspdk_scheduler_gscheduler.so 00:04:08.132 LIB libspdk_accel_ioat.a 00:04:08.132 LIB libspdk_accel_dsa.a 00:04:08.132 LIB libspdk_blob_bdev.a 00:04:08.132 SYMLINK libspdk_accel_error.so 00:04:08.132 SYMLINK libspdk_scheduler_dpdk_governor.so 00:04:08.132 SYMLINK libspdk_scheduler_dynamic.so 00:04:08.132 SYMLINK libspdk_accel_iaa.so 00:04:08.132 SO libspdk_accel_ioat.so.6.0 00:04:08.132 SO libspdk_accel_dsa.so.5.0 00:04:08.132 SO libspdk_blob_bdev.so.11.0 00:04:08.419 SYMLINK libspdk_accel_ioat.so 00:04:08.419 SYMLINK libspdk_blob_bdev.so 00:04:08.419 SYMLINK libspdk_accel_dsa.so 00:04:08.677 LIB libspdk_sock_posix.a 00:04:08.677 SO libspdk_sock_posix.so.6.0 00:04:08.677 SYMLINK libspdk_sock_posix.so 00:04:08.677 CC module/bdev/nvme/bdev_nvme.o 00:04:08.677 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:04:08.677 CC module/blobfs/bdev/blobfs_bdev.o 00:04:08.677 CC module/bdev/nvme/vbdev_opal.o 00:04:08.677 CC module/bdev/nvme/bdev_nvme_rpc.o 00:04:08.677 CC module/bdev/nvme/bdev_mdns_client.o 00:04:08.677 CC module/bdev/nvme/vbdev_opal_rpc.o 00:04:08.677 CC module/bdev/delay/vbdev_delay.o 00:04:08.677 CC module/bdev/nvme/nvme_rpc.o 00:04:08.677 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:04:08.677 CC module/bdev/lvol/vbdev_lvol.o 00:04:08.677 CC module/bdev/delay/vbdev_delay_rpc.o 00:04:08.677 CC module/bdev/zone_block/vbdev_zone_block.o 00:04:08.677 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:04:08.677 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:04:08.677 CC module/bdev/raid/bdev_raid_rpc.o 00:04:08.677 CC module/bdev/raid/bdev_raid.o 00:04:08.677 CC module/bdev/raid/bdev_raid_sb.o 00:04:08.677 CC module/bdev/split/vbdev_split.o 00:04:08.677 CC module/bdev/raid/raid1.o 00:04:08.677 CC module/bdev/gpt/gpt.o 00:04:08.677 CC module/bdev/raid/concat.o 00:04:08.677 CC module/bdev/malloc/bdev_malloc.o 00:04:08.677 CC module/bdev/raid/raid0.o 00:04:08.677 CC module/bdev/gpt/vbdev_gpt.o 00:04:08.677 CC module/bdev/malloc/bdev_malloc_rpc.o 00:04:08.677 CC module/bdev/split/vbdev_split_rpc.o 00:04:08.677 CC module/bdev/error/vbdev_error.o 00:04:08.677 CC module/bdev/error/vbdev_error_rpc.o 00:04:08.677 CC module/bdev/passthru/vbdev_passthru.o 00:04:08.677 CC module/bdev/iscsi/bdev_iscsi.o 00:04:08.677 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:04:08.677 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:04:08.677 CC module/bdev/compress/vbdev_compress.o 00:04:08.677 CC 
module/bdev/compress/vbdev_compress_rpc.o 00:04:08.677 CC module/bdev/crypto/vbdev_crypto.o 00:04:08.677 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:04:08.677 CC module/bdev/null/bdev_null_rpc.o 00:04:08.677 CC module/bdev/null/bdev_null.o 00:04:08.677 CC module/bdev/ftl/bdev_ftl.o 00:04:08.677 CC module/bdev/ftl/bdev_ftl_rpc.o 00:04:08.677 CC module/bdev/aio/bdev_aio.o 00:04:08.677 CC module/bdev/aio/bdev_aio_rpc.o 00:04:08.677 CC module/bdev/virtio/bdev_virtio_scsi.o 00:04:08.677 CC module/bdev/virtio/bdev_virtio_blk.o 00:04:08.677 CC module/bdev/virtio/bdev_virtio_rpc.o 00:04:08.935 LIB libspdk_blobfs_bdev.a 00:04:08.935 LIB libspdk_bdev_error.a 00:04:08.935 LIB libspdk_accel_dpdk_compressdev.a 00:04:09.193 SO libspdk_blobfs_bdev.so.6.0 00:04:09.193 SO libspdk_accel_dpdk_compressdev.so.3.0 00:04:09.193 SO libspdk_bdev_error.so.6.0 00:04:09.193 SYMLINK libspdk_blobfs_bdev.so 00:04:09.193 LIB libspdk_bdev_crypto.a 00:04:09.193 LIB libspdk_bdev_null.a 00:04:09.193 SO libspdk_bdev_crypto.so.6.0 00:04:09.193 SYMLINK libspdk_accel_dpdk_compressdev.so 00:04:09.193 SYMLINK libspdk_bdev_error.so 00:04:09.193 LIB libspdk_bdev_split.a 00:04:09.193 LIB libspdk_bdev_zone_block.a 00:04:09.193 SO libspdk_bdev_null.so.6.0 00:04:09.193 LIB libspdk_bdev_malloc.a 00:04:09.193 SO libspdk_bdev_zone_block.so.6.0 00:04:09.193 SO libspdk_bdev_split.so.6.0 00:04:09.193 LIB libspdk_bdev_delay.a 00:04:09.193 LIB libspdk_bdev_compress.a 00:04:09.193 SYMLINK libspdk_bdev_crypto.so 00:04:09.193 LIB libspdk_bdev_ftl.a 00:04:09.193 SO libspdk_bdev_delay.so.6.0 00:04:09.193 LIB libspdk_bdev_passthru.a 00:04:09.193 SO libspdk_bdev_malloc.so.6.0 00:04:09.193 SYMLINK libspdk_bdev_null.so 00:04:09.193 SO libspdk_bdev_compress.so.6.0 00:04:09.193 LIB libspdk_bdev_aio.a 00:04:09.193 SYMLINK libspdk_bdev_zone_block.so 00:04:09.193 SO libspdk_bdev_ftl.so.6.0 00:04:09.193 SYMLINK libspdk_bdev_split.so 00:04:09.451 SO libspdk_bdev_passthru.so.6.0 00:04:09.451 SO libspdk_bdev_aio.so.6.0 00:04:09.451 LIB libspdk_bdev_gpt.a 00:04:09.451 SYMLINK libspdk_bdev_delay.so 00:04:09.451 LIB libspdk_bdev_virtio.a 00:04:09.452 SYMLINK libspdk_bdev_malloc.so 00:04:09.452 SYMLINK libspdk_bdev_compress.so 00:04:09.452 SYMLINK libspdk_bdev_ftl.so 00:04:09.452 SO libspdk_bdev_gpt.so.6.0 00:04:09.452 SYMLINK libspdk_bdev_passthru.so 00:04:09.452 SO libspdk_bdev_virtio.so.6.0 00:04:09.452 LIB libspdk_accel_dpdk_cryptodev.a 00:04:09.452 SYMLINK libspdk_bdev_aio.so 00:04:09.452 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:04:09.452 SYMLINK libspdk_bdev_gpt.so 00:04:09.452 LIB libspdk_bdev_lvol.a 00:04:09.452 SYMLINK libspdk_bdev_virtio.so 00:04:09.452 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:04:09.452 SO libspdk_bdev_lvol.so.6.0 00:04:09.710 SYMLINK libspdk_bdev_lvol.so 00:04:09.710 LIB libspdk_bdev_iscsi.a 00:04:09.710 LIB libspdk_bdev_raid.a 00:04:09.710 SO libspdk_bdev_iscsi.so.6.0 00:04:09.969 SO libspdk_bdev_raid.so.6.0 00:04:09.969 SYMLINK libspdk_bdev_iscsi.so 00:04:09.969 SYMLINK libspdk_bdev_raid.so 00:04:11.346 LIB libspdk_bdev_nvme.a 00:04:11.346 SO libspdk_bdev_nvme.so.7.0 00:04:11.346 SYMLINK libspdk_bdev_nvme.so 00:04:11.914 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:04:11.914 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:04:11.914 CC module/event/subsystems/iobuf/iobuf.o 00:04:11.914 CC module/event/subsystems/vmd/vmd_rpc.o 00:04:11.914 CC module/event/subsystems/vmd/vmd.o 00:04:11.914 CC module/event/subsystems/scheduler/scheduler.o 00:04:11.914 CC module/event/subsystems/keyring/keyring.o 00:04:11.914 CC 
module/event/subsystems/sock/sock.o 00:04:12.173 LIB libspdk_event_vhost_blk.a 00:04:12.173 LIB libspdk_event_scheduler.a 00:04:12.173 LIB libspdk_event_keyring.a 00:04:12.173 LIB libspdk_event_vmd.a 00:04:12.173 LIB libspdk_event_iobuf.a 00:04:12.173 SO libspdk_event_vhost_blk.so.3.0 00:04:12.173 LIB libspdk_event_sock.a 00:04:12.173 SO libspdk_event_keyring.so.1.0 00:04:12.173 SO libspdk_event_scheduler.so.4.0 00:04:12.173 SO libspdk_event_vmd.so.6.0 00:04:12.173 SO libspdk_event_iobuf.so.3.0 00:04:12.173 SYMLINK libspdk_event_vhost_blk.so 00:04:12.173 SO libspdk_event_sock.so.5.0 00:04:12.173 SYMLINK libspdk_event_keyring.so 00:04:12.173 SYMLINK libspdk_event_scheduler.so 00:04:12.173 SYMLINK libspdk_event_iobuf.so 00:04:12.432 SYMLINK libspdk_event_sock.so 00:04:12.432 SYMLINK libspdk_event_vmd.so 00:04:12.689 CC module/event/subsystems/accel/accel.o 00:04:12.689 LIB libspdk_event_accel.a 00:04:12.947 SO libspdk_event_accel.so.6.0 00:04:12.947 SYMLINK libspdk_event_accel.so 00:04:13.206 CC module/event/subsystems/bdev/bdev.o 00:04:13.465 LIB libspdk_event_bdev.a 00:04:13.465 SO libspdk_event_bdev.so.6.0 00:04:13.724 SYMLINK libspdk_event_bdev.so 00:04:13.982 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:04:13.982 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:04:13.982 CC module/event/subsystems/scsi/scsi.o 00:04:13.982 CC module/event/subsystems/ublk/ublk.o 00:04:13.982 CC module/event/subsystems/nbd/nbd.o 00:04:14.241 LIB libspdk_event_nbd.a 00:04:14.241 LIB libspdk_event_scsi.a 00:04:14.241 LIB libspdk_event_ublk.a 00:04:14.241 SO libspdk_event_nbd.so.6.0 00:04:14.241 SO libspdk_event_scsi.so.6.0 00:04:14.241 SO libspdk_event_ublk.so.3.0 00:04:14.241 LIB libspdk_event_nvmf.a 00:04:14.241 SYMLINK libspdk_event_nbd.so 00:04:14.241 SO libspdk_event_nvmf.so.6.0 00:04:14.241 SYMLINK libspdk_event_scsi.so 00:04:14.241 SYMLINK libspdk_event_ublk.so 00:04:14.241 SYMLINK libspdk_event_nvmf.so 00:04:14.500 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:04:14.759 CC module/event/subsystems/iscsi/iscsi.o 00:04:14.759 LIB libspdk_event_vhost_scsi.a 00:04:14.759 LIB libspdk_event_iscsi.a 00:04:14.759 SO libspdk_event_vhost_scsi.so.3.0 00:04:14.759 SO libspdk_event_iscsi.so.6.0 00:04:15.018 SYMLINK libspdk_event_vhost_scsi.so 00:04:15.018 SYMLINK libspdk_event_iscsi.so 00:04:15.276 SO libspdk.so.6.0 00:04:15.276 SYMLINK libspdk.so 00:04:15.540 CC app/trace_record/trace_record.o 00:04:15.540 CC app/spdk_nvme_discover/discovery_aer.o 00:04:15.540 CC app/spdk_lspci/spdk_lspci.o 00:04:15.540 CC test/rpc_client/rpc_client_test.o 00:04:15.540 CXX app/trace/trace.o 00:04:15.540 TEST_HEADER include/spdk/accel.h 00:04:15.540 TEST_HEADER include/spdk/accel_module.h 00:04:15.540 CC app/spdk_top/spdk_top.o 00:04:15.540 TEST_HEADER include/spdk/barrier.h 00:04:15.540 TEST_HEADER include/spdk/assert.h 00:04:15.540 TEST_HEADER include/spdk/base64.h 00:04:15.540 CC app/spdk_nvme_perf/perf.o 00:04:15.540 TEST_HEADER include/spdk/bdev.h 00:04:15.540 TEST_HEADER include/spdk/bdev_zone.h 00:04:15.540 TEST_HEADER include/spdk/bdev_module.h 00:04:15.540 TEST_HEADER include/spdk/bit_array.h 00:04:15.540 TEST_HEADER include/spdk/bit_pool.h 00:04:15.540 TEST_HEADER include/spdk/blob_bdev.h 00:04:15.540 TEST_HEADER include/spdk/blobfs_bdev.h 00:04:15.540 CC app/spdk_nvme_identify/identify.o 00:04:15.540 TEST_HEADER include/spdk/blobfs.h 00:04:15.540 TEST_HEADER include/spdk/conf.h 00:04:15.540 TEST_HEADER include/spdk/blob.h 00:04:15.540 TEST_HEADER include/spdk/config.h 00:04:15.540 TEST_HEADER include/spdk/cpuset.h 
00:04:15.540 TEST_HEADER include/spdk/crc32.h 00:04:15.540 TEST_HEADER include/spdk/crc16.h 00:04:15.540 TEST_HEADER include/spdk/crc64.h 00:04:15.540 TEST_HEADER include/spdk/dif.h 00:04:15.540 TEST_HEADER include/spdk/dma.h 00:04:15.540 TEST_HEADER include/spdk/endian.h 00:04:15.540 TEST_HEADER include/spdk/env_dpdk.h 00:04:15.540 TEST_HEADER include/spdk/env.h 00:04:15.540 TEST_HEADER include/spdk/fd_group.h 00:04:15.540 TEST_HEADER include/spdk/event.h 00:04:15.540 TEST_HEADER include/spdk/fd.h 00:04:15.540 TEST_HEADER include/spdk/file.h 00:04:15.540 TEST_HEADER include/spdk/gpt_spec.h 00:04:15.540 TEST_HEADER include/spdk/ftl.h 00:04:15.540 TEST_HEADER include/spdk/hexlify.h 00:04:15.540 TEST_HEADER include/spdk/histogram_data.h 00:04:15.540 TEST_HEADER include/spdk/idxd.h 00:04:15.540 TEST_HEADER include/spdk/idxd_spec.h 00:04:15.540 TEST_HEADER include/spdk/ioat.h 00:04:15.540 TEST_HEADER include/spdk/ioat_spec.h 00:04:15.540 TEST_HEADER include/spdk/iscsi_spec.h 00:04:15.540 TEST_HEADER include/spdk/init.h 00:04:15.540 TEST_HEADER include/spdk/json.h 00:04:15.540 TEST_HEADER include/spdk/jsonrpc.h 00:04:15.540 TEST_HEADER include/spdk/keyring.h 00:04:15.540 TEST_HEADER include/spdk/likely.h 00:04:15.540 TEST_HEADER include/spdk/keyring_module.h 00:04:15.540 TEST_HEADER include/spdk/log.h 00:04:15.540 TEST_HEADER include/spdk/memory.h 00:04:15.540 TEST_HEADER include/spdk/lvol.h 00:04:15.540 TEST_HEADER include/spdk/mmio.h 00:04:15.540 TEST_HEADER include/spdk/nbd.h 00:04:15.540 TEST_HEADER include/spdk/nvme.h 00:04:15.540 TEST_HEADER include/spdk/notify.h 00:04:15.540 TEST_HEADER include/spdk/nvme_ocssd.h 00:04:15.540 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:04:15.540 TEST_HEADER include/spdk/nvme_intel.h 00:04:15.540 CC examples/interrupt_tgt/interrupt_tgt.o 00:04:15.540 TEST_HEADER include/spdk/nvme_zns.h 00:04:15.540 TEST_HEADER include/spdk/nvme_spec.h 00:04:15.540 TEST_HEADER include/spdk/nvmf_cmd.h 00:04:15.540 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:04:15.540 TEST_HEADER include/spdk/nvmf_spec.h 00:04:15.540 TEST_HEADER include/spdk/nvmf.h 00:04:15.540 CC app/iscsi_tgt/iscsi_tgt.o 00:04:15.540 CC app/spdk_dd/spdk_dd.o 00:04:15.540 TEST_HEADER include/spdk/nvmf_transport.h 00:04:15.540 TEST_HEADER include/spdk/opal.h 00:04:15.540 TEST_HEADER include/spdk/opal_spec.h 00:04:15.540 TEST_HEADER include/spdk/pci_ids.h 00:04:15.540 TEST_HEADER include/spdk/queue.h 00:04:15.540 TEST_HEADER include/spdk/pipe.h 00:04:15.540 TEST_HEADER include/spdk/reduce.h 00:04:15.540 TEST_HEADER include/spdk/rpc.h 00:04:15.540 TEST_HEADER include/spdk/scheduler.h 00:04:15.540 TEST_HEADER include/spdk/scsi.h 00:04:15.540 TEST_HEADER include/spdk/scsi_spec.h 00:04:15.540 TEST_HEADER include/spdk/sock.h 00:04:15.540 TEST_HEADER include/spdk/stdinc.h 00:04:15.540 TEST_HEADER include/spdk/string.h 00:04:15.540 TEST_HEADER include/spdk/thread.h 00:04:15.540 TEST_HEADER include/spdk/trace.h 00:04:15.540 TEST_HEADER include/spdk/trace_parser.h 00:04:15.540 TEST_HEADER include/spdk/tree.h 00:04:15.540 TEST_HEADER include/spdk/util.h 00:04:15.540 TEST_HEADER include/spdk/uuid.h 00:04:15.540 TEST_HEADER include/spdk/ublk.h 00:04:15.540 TEST_HEADER include/spdk/vfio_user_pci.h 00:04:15.540 TEST_HEADER include/spdk/version.h 00:04:15.540 TEST_HEADER include/spdk/vfio_user_spec.h 00:04:15.540 CC app/nvmf_tgt/nvmf_main.o 00:04:15.540 TEST_HEADER include/spdk/vhost.h 00:04:15.540 TEST_HEADER include/spdk/vmd.h 00:04:15.540 TEST_HEADER include/spdk/xor.h 00:04:15.540 TEST_HEADER include/spdk/zipf.h 
00:04:15.540 CXX test/cpp_headers/accel.o 00:04:15.540 CXX test/cpp_headers/assert.o 00:04:15.540 CXX test/cpp_headers/accel_module.o 00:04:15.540 CXX test/cpp_headers/barrier.o 00:04:15.540 CXX test/cpp_headers/base64.o 00:04:15.540 CXX test/cpp_headers/bdev.o 00:04:15.540 CXX test/cpp_headers/bdev_module.o 00:04:15.540 CXX test/cpp_headers/bdev_zone.o 00:04:15.540 CXX test/cpp_headers/bit_pool.o 00:04:15.540 CC app/spdk_tgt/spdk_tgt.o 00:04:15.540 CXX test/cpp_headers/blobfs_bdev.o 00:04:15.540 CXX test/cpp_headers/blobfs.o 00:04:15.540 CXX test/cpp_headers/blob_bdev.o 00:04:15.540 CXX test/cpp_headers/bit_array.o 00:04:15.540 CXX test/cpp_headers/blob.o 00:04:15.540 CXX test/cpp_headers/conf.o 00:04:15.540 CXX test/cpp_headers/config.o 00:04:15.540 CXX test/cpp_headers/cpuset.o 00:04:15.540 CXX test/cpp_headers/crc16.o 00:04:15.540 CXX test/cpp_headers/crc32.o 00:04:15.540 CXX test/cpp_headers/crc64.o 00:04:15.540 CXX test/cpp_headers/dif.o 00:04:15.540 CXX test/cpp_headers/endian.o 00:04:15.540 CXX test/cpp_headers/env_dpdk.o 00:04:15.540 CXX test/cpp_headers/env.o 00:04:15.540 CXX test/cpp_headers/event.o 00:04:15.540 CXX test/cpp_headers/dma.o 00:04:15.540 CXX test/cpp_headers/fd_group.o 00:04:15.540 CXX test/cpp_headers/fd.o 00:04:15.541 CXX test/cpp_headers/file.o 00:04:15.541 CXX test/cpp_headers/ftl.o 00:04:15.541 CXX test/cpp_headers/gpt_spec.o 00:04:15.541 CXX test/cpp_headers/hexlify.o 00:04:15.541 CXX test/cpp_headers/histogram_data.o 00:04:15.541 CXX test/cpp_headers/idxd_spec.o 00:04:15.541 CXX test/cpp_headers/idxd.o 00:04:15.541 CXX test/cpp_headers/init.o 00:04:15.541 CXX test/cpp_headers/ioat.o 00:04:15.541 CXX test/cpp_headers/iscsi_spec.o 00:04:15.541 CXX test/cpp_headers/ioat_spec.o 00:04:15.541 CXX test/cpp_headers/json.o 00:04:15.541 CXX test/cpp_headers/jsonrpc.o 00:04:15.541 CXX test/cpp_headers/keyring.o 00:04:15.541 CC examples/util/zipf/zipf.o 00:04:15.803 CXX test/cpp_headers/keyring_module.o 00:04:15.804 CC examples/ioat/perf/perf.o 00:04:15.804 CC test/app/histogram_perf/histogram_perf.o 00:04:15.804 CC test/env/pci/pci_ut.o 00:04:15.804 CC examples/ioat/verify/verify.o 00:04:15.804 CC app/fio/nvme/fio_plugin.o 00:04:15.804 CC test/app/jsoncat/jsoncat.o 00:04:15.804 CC test/env/memory/memory_ut.o 00:04:15.804 CC test/env/vtophys/vtophys.o 00:04:15.804 CC test/thread/poller_perf/poller_perf.o 00:04:15.804 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:04:15.804 CC test/app/stub/stub.o 00:04:15.804 CC app/fio/bdev/fio_plugin.o 00:04:15.804 CC test/dma/test_dma/test_dma.o 00:04:15.804 CC test/app/bdev_svc/bdev_svc.o 00:04:16.064 LINK spdk_lspci 00:04:16.064 LINK spdk_trace_record 00:04:16.064 LINK rpc_client_test 00:04:16.064 CC test/env/mem_callbacks/mem_callbacks.o 00:04:16.064 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:04:16.064 LINK interrupt_tgt 00:04:16.064 LINK spdk_nvme_discover 00:04:16.064 LINK iscsi_tgt 00:04:16.064 LINK zipf 00:04:16.064 LINK jsoncat 00:04:16.064 CXX test/cpp_headers/likely.o 00:04:16.064 CXX test/cpp_headers/log.o 00:04:16.064 LINK histogram_perf 00:04:16.064 CXX test/cpp_headers/lvol.o 00:04:16.064 CXX test/cpp_headers/memory.o 00:04:16.064 CXX test/cpp_headers/mmio.o 00:04:16.064 CXX test/cpp_headers/nbd.o 00:04:16.064 CXX test/cpp_headers/notify.o 00:04:16.064 CXX test/cpp_headers/nvme.o 00:04:16.064 CXX test/cpp_headers/nvme_ocssd.o 00:04:16.064 CXX test/cpp_headers/nvme_intel.o 00:04:16.326 CXX test/cpp_headers/nvme_ocssd_spec.o 00:04:16.326 CXX test/cpp_headers/nvme_spec.o 00:04:16.326 CXX test/cpp_headers/nvme_zns.o 
00:04:16.326 LINK nvmf_tgt 00:04:16.326 CXX test/cpp_headers/nvmf_cmd.o 00:04:16.326 CXX test/cpp_headers/nvmf_fc_spec.o 00:04:16.326 LINK poller_perf 00:04:16.326 LINK vtophys 00:04:16.326 CXX test/cpp_headers/nvmf.o 00:04:16.326 CXX test/cpp_headers/nvmf_spec.o 00:04:16.326 CXX test/cpp_headers/nvmf_transport.o 00:04:16.326 CXX test/cpp_headers/opal.o 00:04:16.326 CXX test/cpp_headers/opal_spec.o 00:04:16.326 LINK ioat_perf 00:04:16.326 CXX test/cpp_headers/pci_ids.o 00:04:16.326 LINK verify 00:04:16.326 CXX test/cpp_headers/pipe.o 00:04:16.326 CXX test/cpp_headers/queue.o 00:04:16.326 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:04:16.326 LINK spdk_tgt 00:04:16.326 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:04:16.326 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:04:16.326 CXX test/cpp_headers/reduce.o 00:04:16.326 CXX test/cpp_headers/rpc.o 00:04:16.326 CXX test/cpp_headers/scheduler.o 00:04:16.326 LINK env_dpdk_post_init 00:04:16.326 CXX test/cpp_headers/scsi.o 00:04:16.326 CXX test/cpp_headers/sock.o 00:04:16.326 CXX test/cpp_headers/scsi_spec.o 00:04:16.326 CXX test/cpp_headers/stdinc.o 00:04:16.326 CXX test/cpp_headers/string.o 00:04:16.326 CXX test/cpp_headers/thread.o 00:04:16.326 CXX test/cpp_headers/trace.o 00:04:16.326 CXX test/cpp_headers/trace_parser.o 00:04:16.326 CXX test/cpp_headers/tree.o 00:04:16.326 LINK stub 00:04:16.326 CXX test/cpp_headers/ublk.o 00:04:16.326 CXX test/cpp_headers/util.o 00:04:16.326 CXX test/cpp_headers/uuid.o 00:04:16.326 LINK spdk_dd 00:04:16.326 CXX test/cpp_headers/vfio_user_pci.o 00:04:16.326 CXX test/cpp_headers/version.o 00:04:16.326 CXX test/cpp_headers/vfio_user_spec.o 00:04:16.326 LINK bdev_svc 00:04:16.326 LINK spdk_trace 00:04:16.326 CXX test/cpp_headers/vhost.o 00:04:16.327 CXX test/cpp_headers/vmd.o 00:04:16.327 CXX test/cpp_headers/xor.o 00:04:16.586 CXX test/cpp_headers/zipf.o 00:04:16.586 LINK test_dma 00:04:16.845 LINK nvme_fuzz 00:04:16.845 LINK spdk_nvme 00:04:16.845 LINK spdk_bdev 00:04:16.845 CC examples/idxd/perf/perf.o 00:04:16.845 CC examples/vmd/lsvmd/lsvmd.o 00:04:16.845 CC examples/vmd/led/led.o 00:04:16.845 CC examples/sock/hello_world/hello_sock.o 00:04:16.845 CC test/event/reactor_perf/reactor_perf.o 00:04:16.845 CC test/event/reactor/reactor.o 00:04:16.845 LINK mem_callbacks 00:04:16.845 CC test/event/event_perf/event_perf.o 00:04:16.845 CC test/event/app_repeat/app_repeat.o 00:04:16.845 CC test/event/scheduler/scheduler.o 00:04:16.845 CC examples/thread/thread/thread_ex.o 00:04:16.845 CC app/vhost/vhost.o 00:04:16.845 LINK spdk_top 00:04:17.104 LINK lsvmd 00:04:17.104 LINK led 00:04:17.104 LINK spdk_nvme_perf 00:04:17.104 LINK reactor_perf 00:04:17.104 LINK vhost_fuzz 00:04:17.104 LINK spdk_nvme_identify 00:04:17.104 LINK reactor 00:04:17.104 LINK event_perf 00:04:17.104 LINK app_repeat 00:04:17.104 LINK hello_sock 00:04:17.104 LINK vhost 00:04:17.104 LINK scheduler 00:04:17.104 LINK idxd_perf 00:04:17.104 LINK pci_ut 00:04:17.104 LINK thread 00:04:17.363 CC test/nvme/sgl/sgl.o 00:04:17.363 CC test/nvme/aer/aer.o 00:04:17.363 CC test/nvme/e2edp/nvme_dp.o 00:04:17.363 CC test/nvme/boot_partition/boot_partition.o 00:04:17.363 CC test/nvme/simple_copy/simple_copy.o 00:04:17.363 CC test/nvme/fused_ordering/fused_ordering.o 00:04:17.363 CC test/nvme/compliance/nvme_compliance.o 00:04:17.363 CC test/nvme/fdp/fdp.o 00:04:17.363 CC test/nvme/reset/reset.o 00:04:17.363 CC test/nvme/overhead/overhead.o 00:04:17.363 CC test/nvme/cuse/cuse.o 00:04:17.363 CC test/nvme/startup/startup.o 00:04:17.363 CC 
test/nvme/connect_stress/connect_stress.o 00:04:17.363 CC test/nvme/reserve/reserve.o 00:04:17.363 CC test/nvme/err_injection/err_injection.o 00:04:17.363 CC test/nvme/doorbell_aers/doorbell_aers.o 00:04:17.363 CC test/accel/dif/dif.o 00:04:17.363 CC test/blobfs/mkfs/mkfs.o 00:04:17.363 CC test/lvol/esnap/esnap.o 00:04:17.622 LINK boot_partition 00:04:17.622 LINK err_injection 00:04:17.622 LINK startup 00:04:17.622 LINK doorbell_aers 00:04:17.622 LINK connect_stress 00:04:17.622 CC examples/nvme/hotplug/hotplug.o 00:04:17.622 LINK fused_ordering 00:04:17.622 LINK reserve 00:04:17.622 CC examples/nvme/arbitration/arbitration.o 00:04:17.622 LINK memory_ut 00:04:17.622 CC examples/nvme/abort/abort.o 00:04:17.622 CC examples/nvme/nvme_manage/nvme_manage.o 00:04:17.622 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:04:17.622 CC examples/nvme/reconnect/reconnect.o 00:04:17.622 CC examples/nvme/cmb_copy/cmb_copy.o 00:04:17.622 CC examples/nvme/hello_world/hello_world.o 00:04:17.622 LINK overhead 00:04:17.622 LINK simple_copy 00:04:17.622 LINK fdp 00:04:17.622 LINK sgl 00:04:17.622 LINK reset 00:04:17.622 LINK nvme_dp 00:04:17.622 LINK aer 00:04:17.622 LINK nvme_compliance 00:04:17.622 CC examples/accel/perf/accel_perf.o 00:04:17.622 LINK mkfs 00:04:17.622 CC examples/blob/cli/blobcli.o 00:04:17.622 CC examples/blob/hello_world/hello_blob.o 00:04:17.622 LINK pmr_persistence 00:04:17.882 LINK cmb_copy 00:04:17.882 LINK hotplug 00:04:17.882 LINK dif 00:04:17.882 LINK reconnect 00:04:17.882 LINK hello_world 00:04:17.882 LINK arbitration 00:04:17.882 LINK abort 00:04:17.882 LINK hello_blob 00:04:18.141 LINK nvme_manage 00:04:18.141 LINK accel_perf 00:04:18.141 LINK blobcli 00:04:18.399 LINK iscsi_fuzz 00:04:18.399 CC test/bdev/bdevio/bdevio.o 00:04:18.658 LINK cuse 00:04:18.658 CC examples/bdev/hello_world/hello_bdev.o 00:04:18.658 CC examples/bdev/bdevperf/bdevperf.o 00:04:18.917 LINK bdevio 00:04:19.176 LINK hello_bdev 00:04:19.435 LINK bdevperf 00:04:20.372 CC examples/nvmf/nvmf/nvmf.o 00:04:20.631 LINK nvmf 00:04:22.538 LINK esnap 00:04:23.105 00:04:23.105 real 1m31.680s 00:04:23.105 user 17m23.005s 00:04:23.105 sys 4m12.856s 00:04:23.105 22:11:33 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:04:23.105 22:11:33 make -- common/autotest_common.sh@10 -- $ set +x 00:04:23.105 ************************************ 00:04:23.105 END TEST make 00:04:23.105 ************************************ 00:04:23.105 22:11:33 -- common/autotest_common.sh@1142 -- $ return 0 00:04:23.105 22:11:33 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:23.105 22:11:33 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:23.105 22:11:33 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:23.105 22:11:33 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:23.105 22:11:33 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:04:23.105 22:11:33 -- pm/common@44 -- $ pid=3266254 00:04:23.105 22:11:33 -- pm/common@50 -- $ kill -TERM 3266254 00:04:23.105 22:11:33 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:23.105 22:11:33 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:04:23.105 22:11:33 -- pm/common@44 -- $ pid=3266255 00:04:23.105 22:11:33 -- pm/common@50 -- $ kill -TERM 3266255 00:04:23.105 22:11:33 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:23.105 22:11:33 -- pm/common@43 -- $ [[ -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:04:23.105 22:11:33 -- pm/common@44 -- $ pid=3266256 00:04:23.105 22:11:33 -- pm/common@50 -- $ kill -TERM 3266256 00:04:23.105 22:11:33 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:23.105 22:11:33 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:04:23.105 22:11:33 -- pm/common@44 -- $ pid=3266281 00:04:23.105 22:11:33 -- pm/common@50 -- $ sudo -E kill -TERM 3266281 00:04:23.105 22:11:33 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:04:23.105 22:11:33 -- nvmf/common.sh@7 -- # uname -s 00:04:23.105 22:11:33 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:23.105 22:11:33 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:23.105 22:11:33 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:23.105 22:11:33 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:23.105 22:11:33 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:23.105 22:11:33 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:23.105 22:11:33 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:23.105 22:11:33 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:23.105 22:11:33 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:23.105 22:11:33 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:23.105 22:11:33 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:04:23.105 22:11:33 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:04:23.105 22:11:33 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:23.105 22:11:33 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:23.105 22:11:33 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:23.105 22:11:33 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:23.105 22:11:33 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:04:23.105 22:11:33 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:23.105 22:11:33 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:23.105 22:11:33 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:23.105 22:11:33 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:23.105 22:11:33 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:23.105 22:11:33 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:23.105 22:11:33 -- paths/export.sh@5 -- # export PATH 00:04:23.105 22:11:33 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:23.105 22:11:33 -- nvmf/common.sh@47 -- # : 0 00:04:23.105 22:11:33 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:04:23.105 22:11:33 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:04:23.105 22:11:33 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:23.105 22:11:33 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:23.105 22:11:33 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:23.105 22:11:33 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:04:23.105 22:11:33 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:04:23.105 22:11:33 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:04:23.105 22:11:33 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:23.105 22:11:33 -- spdk/autotest.sh@32 -- # uname -s 00:04:23.105 22:11:33 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:23.105 22:11:33 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:23.105 22:11:33 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:04:23.105 22:11:33 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:23.105 22:11:33 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:04:23.105 22:11:33 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:23.105 22:11:33 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:23.105 22:11:33 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:23.105 22:11:33 -- spdk/autotest.sh@48 -- # udevadm_pid=3332924 00:04:23.105 22:11:33 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:23.105 22:11:33 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:23.105 22:11:33 -- pm/common@17 -- # local monitor 00:04:23.105 22:11:33 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:23.105 22:11:33 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:23.105 22:11:33 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:23.105 22:11:33 -- pm/common@21 -- # date +%s 00:04:23.105 22:11:33 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:23.105 22:11:33 -- pm/common@21 -- # date +%s 00:04:23.105 22:11:33 -- pm/common@25 -- # sleep 1 00:04:23.105 22:11:33 -- pm/common@21 -- # date +%s 00:04:23.105 22:11:33 -- pm/common@21 -- # date +%s 00:04:23.105 22:11:33 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720815093 00:04:23.105 22:11:33 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720815093 00:04:23.105 22:11:33 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720815093 00:04:23.105 22:11:33 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720815093 00:04:23.363 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720815093_collect-vmstat.pm.log 00:04:23.363 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720815093_collect-cpu-load.pm.log 00:04:23.363 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720815093_collect-cpu-temp.pm.log 00:04:23.363 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720815093_collect-bmc-pm.bmc.pm.log 00:04:24.299 22:11:34 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:24.299 22:11:34 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:24.299 22:11:34 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:24.299 22:11:34 -- common/autotest_common.sh@10 -- # set +x 00:04:24.299 22:11:34 -- spdk/autotest.sh@59 -- # create_test_list 00:04:24.299 22:11:34 -- common/autotest_common.sh@746 -- # xtrace_disable 00:04:24.299 22:11:34 -- common/autotest_common.sh@10 -- # set +x 00:04:24.299 22:11:34 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:04:24.299 22:11:34 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:24.299 22:11:34 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:24.299 22:11:34 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:04:24.299 22:11:34 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:24.299 22:11:34 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:24.299 22:11:34 -- common/autotest_common.sh@1455 -- # uname 00:04:24.299 22:11:34 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:04:24.299 22:11:34 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:24.299 22:11:34 -- common/autotest_common.sh@1475 -- # uname 00:04:24.299 22:11:34 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:04:24.299 22:11:34 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:04:24.299 22:11:34 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:04:24.299 22:11:34 -- spdk/autotest.sh@72 -- # hash lcov 00:04:24.299 22:11:34 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:04:24.299 22:11:34 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:04:24.299 --rc lcov_branch_coverage=1 00:04:24.299 --rc lcov_function_coverage=1 00:04:24.299 --rc genhtml_branch_coverage=1 00:04:24.299 --rc genhtml_function_coverage=1 00:04:24.299 --rc genhtml_legend=1 00:04:24.299 --rc geninfo_all_blocks=1 00:04:24.299 ' 00:04:24.299 22:11:34 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:04:24.299 --rc lcov_branch_coverage=1 00:04:24.299 --rc lcov_function_coverage=1 00:04:24.299 --rc genhtml_branch_coverage=1 00:04:24.299 --rc genhtml_function_coverage=1 00:04:24.299 --rc genhtml_legend=1 00:04:24.299 --rc geninfo_all_blocks=1 00:04:24.299 ' 00:04:24.299 22:11:34 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:04:24.299 --rc lcov_branch_coverage=1 00:04:24.299 --rc lcov_function_coverage=1 00:04:24.299 --rc genhtml_branch_coverage=1 00:04:24.299 --rc genhtml_function_coverage=1 00:04:24.299 --rc genhtml_legend=1 00:04:24.299 --rc geninfo_all_blocks=1 00:04:24.299 --no-external' 00:04:24.299 22:11:34 -- spdk/autotest.sh@81 -- # 
LCOV='lcov 00:04:24.299 --rc lcov_branch_coverage=1 00:04:24.299 --rc lcov_function_coverage=1 00:04:24.299 --rc genhtml_branch_coverage=1 00:04:24.299 --rc genhtml_function_coverage=1 00:04:24.299 --rc genhtml_legend=1 00:04:24.300 --rc geninfo_all_blocks=1 00:04:24.300 --no-external' 00:04:24.300 22:11:34 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:04:24.300 lcov: LCOV version 1.14 00:04:24.300 22:11:34 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:04:28.486 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:04:28.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:04:28.486 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:04:28.487 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:04:28.487 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:04:28.487 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:04:28.487 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:04:28.746 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:04:28.746 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:04:28.746 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:04:28.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:04:28.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:04:28.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:04:28.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:04:28.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:04:28.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:04:28.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:04:28.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:04:28.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:04:28.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:04:28.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:04:28.747 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:04:28.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:04:28.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:04:28.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:04:28.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:04:28.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:04:28.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:04:28.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:04:28.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:04:28.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:04:28.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:04:28.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:04:28.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:04:29.006 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:04:29.006 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:04:43.968 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:04:43.968 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:50.536 22:11:59 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:04:50.536 22:11:59 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:50.536 22:11:59 -- common/autotest_common.sh@10 -- # set +x 00:04:50.536 22:11:59 -- spdk/autotest.sh@91 -- # rm -f 00:04:50.536 22:11:59 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:53.822 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:53.822 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:53.822 0000:5e:00.0 (8086 0b60): Already using the nvme driver 00:04:53.822 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:00:04.0 (8086 2021): Already using the 
ioatdma driver 00:04:53.822 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:53.822 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:53.822 22:12:03 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:04:53.822 22:12:03 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:53.822 22:12:03 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:53.822 22:12:03 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:53.822 22:12:03 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:53.822 22:12:03 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:53.822 22:12:03 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:53.822 22:12:03 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:53.822 22:12:03 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:53.822 22:12:03 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:04:53.822 22:12:03 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:04:53.822 22:12:03 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:04:53.822 22:12:03 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:04:53.822 22:12:03 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:04:53.822 22:12:03 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:53.822 No valid GPT data, bailing 00:04:53.822 22:12:04 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:53.822 22:12:04 -- scripts/common.sh@391 -- # pt= 00:04:53.822 22:12:04 -- scripts/common.sh@392 -- # return 1 00:04:53.822 22:12:04 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:53.822 1+0 records in 00:04:53.822 1+0 records out 00:04:53.822 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00676167 s, 155 MB/s 00:04:53.822 22:12:04 -- spdk/autotest.sh@118 -- # sync 00:04:53.822 22:12:04 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:53.822 22:12:04 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:53.822 22:12:04 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:59.085 22:12:09 -- spdk/autotest.sh@124 -- # uname -s 00:04:59.086 22:12:09 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:04:59.086 22:12:09 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:59.086 22:12:09 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:59.086 22:12:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:59.086 22:12:09 -- common/autotest_common.sh@10 -- # set +x 00:04:59.086 ************************************ 00:04:59.086 START TEST setup.sh 00:04:59.086 ************************************ 00:04:59.086 22:12:09 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:59.086 * Looking for test storage... 
00:04:59.086 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:59.086 22:12:09 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:59.086 22:12:09 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:59.086 22:12:09 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:59.086 22:12:09 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:59.086 22:12:09 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:59.086 22:12:09 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:59.086 ************************************ 00:04:59.086 START TEST acl 00:04:59.086 ************************************ 00:04:59.086 22:12:09 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:59.086 * Looking for test storage... 00:04:59.086 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:59.086 22:12:09 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:59.086 22:12:09 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:59.086 22:12:09 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:59.086 22:12:09 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:59.086 22:12:09 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:59.086 22:12:09 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:59.086 22:12:09 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:59.086 22:12:09 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:59.086 22:12:09 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:59.086 22:12:09 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:59.086 22:12:09 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:59.086 22:12:09 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:59.086 22:12:09 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:59.086 22:12:09 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:59.086 22:12:09 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:59.086 22:12:09 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:03.269 22:12:13 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:05:03.269 22:12:13 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:05:03.269 22:12:13 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:03.269 22:12:13 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:05:03.269 22:12:13 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:05:03.269 22:12:13 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:06.550 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:05:06.550 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:06.550 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.550 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:05:06.550 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:06.550 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.550 Hugepages 00:05:06.550 node hugesize free / total 
00:05:06.550 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:05:06.550 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:06.550 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.550 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:05:06.550 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 00:05:06.551 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # 
continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 
22:12:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:05:06.551 22:12:16 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:05:06.551 22:12:16 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:06.551 22:12:16 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:06.551 22:12:16 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:06.551 ************************************ 00:05:06.551 START TEST denied 00:05:06.551 ************************************ 00:05:06.551 22:12:16 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:05:06.551 22:12:16 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:05:06.551 22:12:16 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:05:06.551 22:12:16 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:05:06.551 22:12:16 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:05:06.551 22:12:16 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:10.740 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:05:10.740 22:12:20 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:05:10.740 22:12:20 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:05:10.740 22:12:20 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:05:10.740 22:12:20 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:05:10.740 22:12:20 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:05:10.740 22:12:20 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:05:10.740 22:12:20 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:05:10.740 22:12:20 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:05:10.740 22:12:20 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:10.740 22:12:20 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:15.013 00:05:15.013 real 0m8.490s 00:05:15.013 user 0m2.621s 00:05:15.013 sys 0m5.102s 00:05:15.013 22:12:25 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:15.013 22:12:25 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:05:15.013 ************************************ 00:05:15.013 END TEST denied 00:05:15.013 ************************************ 00:05:15.013 22:12:25 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:05:15.013 22:12:25 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:05:15.013 22:12:25 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:15.013 22:12:25 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:15.013 22:12:25 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:15.013 ************************************ 00:05:15.013 START TEST allowed 00:05:15.013 ************************************ 00:05:15.014 22:12:25 setup.sh.acl.allowed -- 
common/autotest_common.sh@1123 -- # allowed 00:05:15.014 22:12:25 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:05:15.014 22:12:25 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:05:15.014 22:12:25 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:05:15.014 22:12:25 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:15.014 22:12:25 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:05:21.581 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:21.581 22:12:31 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:05:21.581 22:12:31 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:05:21.582 22:12:31 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:05:21.582 22:12:31 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:21.582 22:12:31 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:25.775 00:05:25.775 real 0m10.504s 00:05:25.775 user 0m2.738s 00:05:25.775 sys 0m5.260s 00:05:25.775 22:12:35 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:25.775 22:12:35 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:05:25.775 ************************************ 00:05:25.775 END TEST allowed 00:05:25.775 ************************************ 00:05:25.775 22:12:35 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:05:25.775 00:05:25.775 real 0m26.567s 00:05:25.775 user 0m7.980s 00:05:25.775 sys 0m15.470s 00:05:25.775 22:12:35 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:25.775 22:12:35 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:25.775 ************************************ 00:05:25.775 END TEST acl 00:05:25.775 ************************************ 00:05:25.775 22:12:35 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:25.775 22:12:35 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:05:25.775 22:12:35 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:25.775 22:12:35 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:25.775 22:12:35 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:25.775 ************************************ 00:05:25.775 START TEST hugepages 00:05:25.775 ************************************ 00:05:25.775 22:12:35 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:05:25.775 * Looking for test storage... 
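(Editor's sketch, not captured output.) The acl suite that finishes above drives scripts/setup.sh through two environment variables: PCI_BLOCKED makes "setup.sh config" skip a controller so it stays bound to the kernel nvme driver, while PCI_ALLOWED restricts the rebind to a single BDF so it moves from nvme to vfio-pci. A minimal sketch of that pattern, assuming only the variables, grep patterns, and sysfs paths visible in the trace (the check_driver helper name is illustrative, and setup.sh needs root):

    #!/usr/bin/env bash
    set -e
    cd /var/jenkins/workspace/crypto-phy-autotest/spdk

    check_driver() {
        # Resolve the driver currently bound to a PCI BDF via sysfs.
        basename "$(readlink -f "/sys/bus/pci/devices/$1/driver")"
    }

    # "denied": block 0000:5e:00.0; setup.sh must skip it and leave nvme bound.
    PCI_BLOCKED=' 0000:5e:00.0' ./scripts/setup.sh config \
        | grep 'Skipping denied controller at 0000:5e:00.0'
    [[ "$(check_driver 0000:5e:00.0)" == nvme ]]
    ./scripts/setup.sh reset

    # "allowed": allow only 0000:5e:00.0; setup.sh should rebind nvme -> vfio-pci.
    PCI_ALLOWED=0000:5e:00.0 ./scripts/setup.sh config \
        | grep -E '0000:5e:00.0 .*: nvme -> .*'
    ./scripts/setup.sh reset

The hugepages suite that starts below follows the same shape: setup/common.sh's get_meminfo walks /proc/meminfo (or a per-node meminfo) key by key, which is why the trace repeats the "[[ <key> == Hugepagesize ]] / continue" pair for every meminfo field before echoing 2048, the Hugepagesize value used as the default hugepage size.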
00:05:25.775 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77049024 kB' 'MemAvailable: 80321064 kB' 'Buffers: 11136 kB' 'Cached: 9279312 kB' 'SwapCached: 0 kB' 'Active: 6316352 kB' 'Inactive: 3442188 kB' 'Active(anon): 5925736 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 471440 kB' 'Mapped: 156480 kB' 'Shmem: 5457644 kB' 'KReclaimable: 185760 kB' 'Slab: 491748 kB' 'SReclaimable: 185760 kB' 'SUnreclaim: 305988 kB' 'KernelStack: 16064 kB' 'PageTables: 7752 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438188 kB' 'Committed_AS: 7316200 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200760 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages 
-- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 
-- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.775 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:25.776 22:12:36 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:05:25.776 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:25.777 22:12:36 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:05:25.777 22:12:36 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:25.777 22:12:36 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:25.777 22:12:36 
setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:26.034 ************************************ 00:05:26.034 START TEST default_setup 00:05:26.034 ************************************ 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:05:26.034 22:12:36 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:30.222 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:30.222 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:30.222 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:30.222 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:30.222 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:30.222 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:30.222 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:30.223 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:30.223 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:30.223 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:30.223 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:30.223 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:30.223 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:30.223 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:30.223 
0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:30.223 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:30.223 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:30.223 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:32.761 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79178900 kB' 'MemAvailable: 82450852 kB' 'Buffers: 11136 kB' 'Cached: 9279428 kB' 'SwapCached: 0 kB' 'Active: 6336284 kB' 'Inactive: 3442188 kB' 'Active(anon): 5945668 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490924 kB' 'Mapped: 156816 kB' 'Shmem: 5457760 kB' 'KReclaimable: 185584 kB' 'Slab: 490708 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305124 kB' 'KernelStack: 16240 kB' 'PageTables: 8220 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7333664 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 
'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.761 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:32.762 22:12:42 
setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.762 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79180572 kB' 'MemAvailable: 82452524 kB' 'Buffers: 11136 kB' 'Cached: 9279432 kB' 'SwapCached: 0 kB' 'Active: 6335832 kB' 'Inactive: 3442188 kB' 'Active(anon): 5945216 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490492 kB' 'Mapped: 156760 kB' 'Shmem: 5457764 kB' 'KReclaimable: 185584 kB' 'Slab: 490676 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305092 kB' 'KernelStack: 16208 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7333684 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var 
val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.763 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 
22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79184064 kB' 'MemAvailable: 82456016 kB' 'Buffers: 11136 kB' 'Cached: 9279448 kB' 'SwapCached: 0 kB' 'Active: 6335960 kB' 'Inactive: 
3442188 kB' 'Active(anon): 5945344 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490624 kB' 'Mapped: 156752 kB' 'Shmem: 5457780 kB' 'KReclaimable: 185584 kB' 'Slab: 490708 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305124 kB' 'KernelStack: 16336 kB' 'PageTables: 8132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7333704 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.764 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 
22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.765 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 
-- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:32.766 nr_hugepages=1024 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:32.766 resv_hugepages=0 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:32.766 surplus_hugepages=0 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:32.766 anon_hugepages=0 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79185088 kB' 'MemAvailable: 82457040 kB' 'Buffers: 11136 kB' 'Cached: 9279472 kB' 'SwapCached: 0 kB' 'Active: 6336636 kB' 'Inactive: 3442188 kB' 'Active(anon): 5946020 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491316 kB' 'Mapped: 156752 kB' 'Shmem: 5457804 kB' 'KReclaimable: 185584 kB' 'Slab: 490612 kB' 'SReclaimable: 185584 kB' 
'SUnreclaim: 305028 kB' 'KernelStack: 16512 kB' 'PageTables: 8612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7333728 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201128 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.766 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.767 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:05:32.768 22:12:42 
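The long run of [[ key == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue entries above is the trace of a key-by-key scan over meminfo that stops at the requested field and echoes its value (here 1024 for HugePages_Total). A minimal standalone sketch of that scan follows; the function name, argument order, and the sed-based "Node N " stripping are assumptions for illustration, not the exact setup/common.sh helper.

# Sketch only: scan meminfo (global or per-node) for one key and print its value.
get_meminfo_sketch() {
    local get=$1 node=$2 var val _
    local mem_f=/proc/meminfo
    # Per-node queries read that node's own meminfo when it exists.
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] \
        && mem_f=/sys/devices/system/node/node$node/meminfo
    while IFS=': ' read -r var val _; do
        # Skip every field until the requested one, exactly as the trace shows.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < <(sed 's/^Node [0-9]* //' "$mem_f")   # drop the "Node N " prefix of per-node files
    return 1
}

In this run such a scan would print 1024 for HugePages_Total from /proc/meminfo and 0 for HugePages_Surp from node 0's meminfo.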
setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37069180 kB' 'MemUsed: 11047760 kB' 'SwapCached: 0 kB' 'Active: 4873232 kB' 'Inactive: 3371792 kB' 'Active(anon): 4715336 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371792 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8005424 kB' 'Mapped: 80124 kB' 'AnonPages: 242380 kB' 'Shmem: 4475736 kB' 'KernelStack: 9224 kB' 'PageTables: 4724 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105944 kB' 'Slab: 294792 kB' 'SReclaimable: 105944 kB' 'SUnreclaim: 188848 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.768 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 
22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:32.769 node0=1024 expecting 1024 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:32.769 00:05:32.769 real 0m6.673s 00:05:32.769 user 0m1.790s 00:05:32.769 sys 0m2.736s 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:32.769 22:12:42 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:05:32.769 ************************************ 00:05:32.769 END TEST default_setup 00:05:32.769 ************************************ 00:05:32.769 22:12:42 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:32.769 22:12:42 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:05:32.769 22:12:42 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:32.769 22:12:42 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:32.769 22:12:42 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:32.769 ************************************ 00:05:32.769 START TEST per_node_1G_alloc 00:05:32.769 ************************************ 00:05:32.769 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:05:32.769 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:05:32.769 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:05:32.769 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:05:32.769 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:05:32.769 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:05:32.769 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:05:32.769 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:05:32.769 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:32.769 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # 
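Between the two scans the trace walks /sys/devices/system/node/node*, adds the reserved and per-node surplus counts to the expected figure, and finishes with the "node0=1024 expecting 1024" comparison. A rough sketch of that bookkeeping is below; the array names and the 2048 kB sysfs hugepage path are assumptions drawn from the values visible in this run, not the SPDK helpers themselves.

# Sketch only: compare what sysfs reports per node with what the test expects.
nodes_sys=() nodes_test=()
nodes_test[0]=1024            # pages this test asked node 0 to hold
resv=0 surp=0                 # HugePages_Rsvd and per-node HugePages_Surp in this run
for d in /sys/devices/system/node/node[0-9]*; do
    n=${d##*node}
    nodes_sys[n]=$(< "$d/hugepages/hugepages-2048kB/nr_hugepages")
done
for n in "${!nodes_test[@]}"; do
    (( nodes_test[n] += resv + surp ))
    echo "node$n=${nodes_sys[n]} expecting ${nodes_test[n]}"
done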
get_test_nr_hugepages_per_node 0 1 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:32.770 22:12:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:36.061 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:36.061 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:36.061 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:36.061 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:36.061 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # 
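The per_node_1G_alloc prologue above turns the 1048576 request into 512 pages on each of nodes 0 and 1 (NRHUGE=512, HUGENODE=0,1) before re-running setup.sh, which is why the test then expects nr_hugepages=1024 overall. The arithmetic is sketched below; treating the request as kB is an assumption based on the 2048 kB Hugepagesize reported later in the log.

# Sketch only: sizing behind NRHUGE=512 HUGENODE=0,1 in this run.
size_kb=1048576                          # 1 GiB requested per node
hugepage_kb=2048                         # Hugepagesize from /proc/meminfo
per_node=$(( size_kb / hugepage_kb ))    # 512 pages per node
nodes=(0 1)
total=$(( per_node * ${#nodes[@]} ))     # 1024 pages across both nodes
echo "NRHUGE=$per_node HUGENODE=$(IFS=,; echo "${nodes[*]}") total=$total"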
verify_nr_hugepages 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79160600 kB' 'MemAvailable: 82432552 kB' 'Buffers: 11136 kB' 'Cached: 9279560 kB' 'SwapCached: 0 kB' 'Active: 6335836 kB' 'Inactive: 3442188 kB' 'Active(anon): 5945220 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490500 kB' 'Mapped: 156716 kB' 'Shmem: 5457892 kB' 'KReclaimable: 185584 kB' 'Slab: 490960 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305376 kB' 'KernelStack: 16128 kB' 'PageTables: 7684 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7331572 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.061 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.062 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:36.063 22:12:46 
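verify_nr_hugepages opens by testing the string "always [madvise] never" against *[never]*, i.e. whether transparent huge pages are fully disabled, and only then reads AnonHugePages out of /proc/meminfo (0 kB in this run). A small sketch of that guard follows; taking the string from the usual transparent_hugepage sysfs knob is an assumption, since the trace only shows the comparison itself.

# Sketch only: account for anonymous huge pages unless THP is set to [never].
anon=0
thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
if [[ $thp != *"[never]"* ]]; then
    # THP is not disabled, so anonymous huge pages may exist; read the current figure in kB.
    anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
fi
echo "AnonHugePages accounted for: ${anon} kB"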
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79161284 kB' 'MemAvailable: 82433236 kB' 'Buffers: 11136 kB' 'Cached: 9279564 kB' 'SwapCached: 0 kB' 'Active: 6335552 kB' 'Inactive: 3442188 kB' 'Active(anon): 5944936 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490256 kB' 'Mapped: 156684 kB' 'Shmem: 5457896 kB' 'KReclaimable: 185584 kB' 'Slab: 490924 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305340 kB' 'KernelStack: 16128 kB' 'PageTables: 7676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7331592 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
[[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.063 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local 
get=HugePages_Rsvd 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.064 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79160528 kB' 'MemAvailable: 82432480 kB' 'Buffers: 11136 kB' 'Cached: 9279564 kB' 'SwapCached: 0 kB' 'Active: 6335592 kB' 'Inactive: 3442188 kB' 'Active(anon): 5944976 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490288 kB' 'Mapped: 156684 kB' 'Shmem: 5457896 kB' 'KReclaimable: 185584 kB' 'Slab: 490924 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305340 kB' 'KernelStack: 16144 kB' 'PageTables: 7724 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7331248 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 
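
The trace above is setup/common.sh's get_meminfo helper scanning /proc/meminfo: it splits each line with IFS=': ' into key and value, compares the key against the requested field (HugePages_Surp first, then HugePages_Rsvd), and echoes the value once it matches; every non-matching key produces one "continue" entry, which is why the same three trace lines repeat for each meminfo field. A minimal stand-alone sketch of that lookup follows; get_meminfo_sketch is a hypothetical name, not the SPDK helper itself, and the per-node handling is an assumption based on the sysfs path the trace tests.

#!/usr/bin/env bash
# Sketch only: mimics the lookup traced above; it is not the SPDK helper.
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # Per-node counters live under sysfs; fall back to the global file,
    # matching the "[[ -e /sys/devices/system/node/node/meminfo ]]" test
    # seen in the trace (node is empty there, so the global file is used).
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    # Per-node files prefix each line with "Node <n> "; strip that, then
    # split "Key:   value kB" on ': ' exactly as the traced loop does.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")
    return 1
}

# The fields this test reads back; the snapshot above shows 0, 0 and 1024.
echo "HugePages_Surp:  $(get_meminfo_sketch HugePages_Surp)"
echo "HugePages_Rsvd:  $(get_meminfo_sketch HugePages_Rsvd)"
echo "HugePages_Total: $(get_meminfo_sketch HugePages_Total)"
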
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.065 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:36.066 nr_hugepages=1024 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:36.066 resv_hugepages=0 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:36.066 surplus_hugepages=0 00:05:36.066 22:12:46 
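
At this point hugepages.sh has read back HugePages_Surp=0 and HugePages_Rsvd=0 and echoes the summary nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0 (anon_hugepages=0 follows); it then guards the result with checks of the form (( 1024 == nr_hugepages + surp + resv )). A hedged stand-alone version of that accounting check is sketched below; it reads the counters straight from /proc/meminfo, treats nr_hugepages simply as the configured pool size, and the Hugetlb cross-check assumes only the default 2048 kB page size is in use, as in this run (1024 x 2048 kB = 2097152 kB, matching the snapshot above).

#!/usr/bin/env bash
# Sketch of the bookkeeping the test performs after the lookups; the
# variable names below are illustrative, not the hugepages.sh internals.
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
pagesz_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)
hugetlb_kb=$(awk '/^Hugetlb:/ {print $2}' /proc/meminfo)

nr_hugepages=$total   # assumption: compare against the configured count

# Same shape as the traced "(( 1024 == nr_hugepages + surp + resv ))" guard.
if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting consistent: $total pages"
else
    echo "accounting mismatch: total=$total surp=$surp resv=$resv" >&2
fi

# Cross-check: with a single page size, Hugetlb = HugePages_Total * Hugepagesize.
if (( hugetlb_kb == total * pagesz_kb )); then
    echo "Hugetlb matches: $hugetlb_kb kB ($total pages of $pagesz_kb kB)"
fi
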
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:36.066 anon_hugepages=0 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:36.066 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79159896 kB' 'MemAvailable: 82431848 kB' 'Buffers: 11136 kB' 'Cached: 9279604 kB' 'SwapCached: 0 kB' 'Active: 6335516 kB' 'Inactive: 3442188 kB' 'Active(anon): 5944900 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490212 kB' 'Mapped: 156684 kB' 'Shmem: 5457936 kB' 'KReclaimable: 185584 kB' 'Slab: 490924 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305340 kB' 'KernelStack: 16112 kB' 'PageTables: 7636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7331268 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.067 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:36.068 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 38103548 kB' 'MemUsed: 10013392 kB' 'SwapCached: 0 kB' 'Active: 4871888 kB' 'Inactive: 3371792 kB' 'Active(anon): 4713992 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371792 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8005452 kB' 'Mapped: 80112 kB' 'AnonPages: 241300 kB' 'Shmem: 4475764 kB' 'KernelStack: 8776 kB' 'PageTables: 3756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105944 kB' 'Slab: 294948 kB' 'SReclaimable: 105944 kB' 'SUnreclaim: 189004 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.331 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:36.332 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41056972 kB' 'MemUsed: 3119560 kB' 'SwapCached: 0 kB' 'Active: 1462940 kB' 'Inactive: 70396 kB' 'Active(anon): 1230220 kB' 'Inactive(anon): 0 kB' 'Active(file): 232720 kB' 'Inactive(file): 70396 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1285332 kB' 'Mapped: 75760 kB' 'AnonPages: 248136 kB' 'Shmem: 982216 kB' 'KernelStack: 7320 kB' 'PageTables: 3812 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79640 kB' 'Slab: 195968 kB' 'SReclaimable: 79640 kB' 'SUnreclaim: 116328 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 
22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.333 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:36.334 22:12:46 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:05:36.334 node0=512 expecting 512
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:05:36.334 node1=512 expecting 512
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:05:36.334
00:05:36.334 real 0m3.570s
00:05:36.334 user 0m1.339s
00:05:36.334 sys 0m2.293s
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:36.334 22:12:46 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:36.334 ************************************
00:05:36.334 END TEST per_node_1G_alloc
00:05:36.334 ************************************
00:05:36.334 22:12:46 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:05:36.334 22:12:46 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:05:36.334 22:12:46 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:05:36.334 22:12:46 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:36.334 22:12:46 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
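Note on the trace above: the long key-by-key scan is the xtrace expansion of a helper that pulls one field out of a per-node meminfo file (HugePages_Surp for node0 and node1 here), followed by the check that each node really holds 512 of the 1024 test hugepages. A minimal stand-alone sketch of that readout, written for this note and not the literal setup/common.sh helper (the path and the "Node <N> " line prefix are as shown in the trace):

  # Sketch only: print one field from a per-node meminfo file, e.g.
  #   get_node_meminfo HugePages_Surp 0    ->  prints "0"
  get_node_meminfo() {
      local key=$1 node=$2 var val _
      # each line looks like "Node 0 HugePages_Surp: 0"; strip the prefix, then split on ': '
      while IFS=': ' read -r var val _; do
          [[ $var == "$key" ]] && { echo "$val"; return 0; }
      done < <(sed "s/^Node $node //" "/sys/devices/system/node/node$node/meminfo")
      return 1
  }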
00:05:36.334 ************************************
00:05:36.334 START TEST even_2G_alloc
00:05:36.334 ************************************
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=()
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:36.334 22:12:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:05:39.629 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:05:39.629 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:05:39.629 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:05:39.629 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:39.629 0000:00:04.6 (8086 2021): Already using
the vfio-pci driver 00:05:39.629 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:39.629 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79145148 kB' 'MemAvailable: 82417100 kB' 'Buffers: 11136 kB' 'Cached: 9279712 kB' 'SwapCached: 0 kB' 'Active: 6334908 kB' 'Inactive: 3442188 kB' 'Active(anon): 5944292 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 
8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489028 kB' 'Mapped: 155768 kB' 'Shmem: 5458044 kB' 'KReclaimable: 185584 kB' 'Slab: 491400 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305816 kB' 'KernelStack: 16112 kB' 'PageTables: 7588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7326676 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.629 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 
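The trace above is the even_2G_alloc test calling get_meminfo from setup/common.sh against /proc/meminfo: the helper reads the file into an array, strips any "Node N " prefix, then scans "Key: value" pairs until the requested key (here AnonHugePages) matches, echoes its value, and returns 0; every non-matching key produces one of the "continue" lines seen in the trace. A minimal sketch of that helper, reconstructed from the trace rather than taken verbatim from the SPDK source (the exact function body, the extglob handling, and the driver lines at the bottom are assumptions), is:

#!/usr/bin/env bash
# Sketch (assumption): approximate reconstruction of the get_meminfo helper
# traced above; only its observable behaviour (meminfo file selection,
# "Node N " stripping, key scan) is taken from the trace.
shopt -s extglob   # the "+([0-9])" prefix-strip below is an extglob pattern

get_meminfo() {
    local get=$1    # key to look up, e.g. AnonHugePages or HugePages_Surp
    local node=$2   # optional NUMA node; empty in this trace, so /proc/meminfo is used
    local var val
    local mem_f mem

    mem_f=/proc/meminfo
    # With a node argument, prefer the per-node meminfo file when it exists.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem < "$mem_f"
    # Per-node meminfo prefixes every line with "Node N "; strip that prefix.
    mem=("${mem[@]#Node +([0-9]) }")

    # Scan "Key: value ..." pairs and print the value of the requested key.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

# Hypothetical driver mirroring the hugepages.sh lines visible in this trace:
anon=$(get_meminfo AnonHugePages)
surp=$(get_meminfo HugePages_Surp)
resv=$(get_meminfo HugePages_Rsvd)
echo "anon=$anon surp=$surp resv=$resv"

Further down in the same trace, setup/hugepages.sh captures anon, surp and resv exactly this way, echoes nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0 and anon_hugepages=0, and then verifies (( 1024 == nr_hugepages + surp + resv )) before re-reading HugePages_Total.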
00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79144480 kB' 'MemAvailable: 82416432 kB' 'Buffers: 11136 kB' 'Cached: 9279716 kB' 'SwapCached: 0 kB' 'Active: 6334644 kB' 'Inactive: 3442188 kB' 'Active(anon): 5944028 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489212 kB' 'Mapped: 155644 kB' 'Shmem: 5458048 kB' 'KReclaimable: 185584 kB' 'Slab: 491372 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305788 kB' 'KernelStack: 16096 kB' 'PageTables: 7536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7326692 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.630 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.631 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:39.632 
22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79143724 kB' 'MemAvailable: 82415676 kB' 'Buffers: 11136 kB' 'Cached: 9279732 kB' 'SwapCached: 0 kB' 'Active: 6334848 kB' 'Inactive: 3442188 kB' 'Active(anon): 5944232 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489356 kB' 'Mapped: 155644 kB' 'Shmem: 5458064 kB' 'KReclaimable: 185584 kB' 'Slab: 491372 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305788 kB' 'KernelStack: 16096 kB' 'PageTables: 7568 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7326344 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.632 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.633 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 
22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:39.634 nr_hugepages=1024 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:39.634 resv_hugepages=0 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:39.634 surplus_hugepages=0 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:39.634 anon_hugepages=0 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79143988 kB' 'MemAvailable: 82415940 kB' 'Buffers: 11136 kB' 'Cached: 9279752 kB' 'SwapCached: 0 kB' 'Active: 6334964 kB' 'Inactive: 3442188 kB' 'Active(anon): 5944348 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489472 kB' 'Mapped: 155644 kB' 'Shmem: 5458084 kB' 'KReclaimable: 185584 kB' 'Slab: 491364 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305780 kB' 'KernelStack: 16064 kB' 'PageTables: 7424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7326500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.634 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
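The xtrace records above and below are setup/common.sh's get_meminfo walking the full /proc/meminfo dump one "key: value" record at a time with IFS=': ', skipping every key until it reaches the one requested (HugePages_Rsvd for the resv=0 check, then HugePages_Total for the 1024-page check). A minimal sketch of that parsing pattern follows; get_meminfo_sketch is an illustrative name and a compressed stand-in, not the real setup/common.sh helper.

shopt -s extglob

# Illustrative stand-in for the get_meminfo pattern traced here; treat every
# name below as an assumption rather than the script's actual API.
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # per-node files prefix lines with "Node <n> "
    local line var val _
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # skip every key we were not asked for
        echo "$val"                        # numeric part only; the "kB" unit lands in $_
        return 0
    done
    return 1
}

# e.g. rsvd=$(get_meminfo_sketch HugePages_Rsvd)    -> 0 in the dump above
#      free0=$(get_meminfo_sketch HugePages_Free 0) -> 512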
00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.635 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 38085256 kB' 'MemUsed: 10031684 kB' 'SwapCached: 0 kB' 'Active: 4870868 kB' 'Inactive: 3371792 kB' 'Active(anon): 4712972 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371792 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8005564 kB' 'Mapped: 79876 kB' 'AnonPages: 240240 kB' 'Shmem: 4475876 kB' 'KernelStack: 8760 kB' 'PageTables: 3668 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 
kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105944 kB' 'Slab: 295344 kB' 'SReclaimable: 105944 kB' 'SUnreclaim: 189400 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.636 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
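The records around this point repeat the same walk against the node-local file /sys/devices/system/node/node0/meminfo (and then node1/meminfo) to pull HugePages_Surp for each NUMA node, feeding hugepages.sh's per-node accounting that ends in the "node0=512 expecting 512" / "node1=512 expecting 512" checks. A hedged sketch of that per-node verification, reusing the illustrative get_meminfo_sketch helper above; the loop shape and names are assumptions, not hugepages.sh's internals.

# Read each node's local hugepage count and compare against the 512-per-node
# split that even_2G_alloc asked for.
declare -a node_pages=()
for node_dir in /sys/devices/system/node/node[0-9]*; do
    node=${node_dir##*node}
    node_pages[$node]=$(get_meminfo_sketch HugePages_Total "$node")
    echo "node${node}=${node_pages[$node]} expecting 512"
    (( node_pages[$node] == 512 )) || echo "node${node}: unexpected hugepage count" >&2
done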
00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.637 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41058580 kB' 'MemUsed: 3117952 kB' 'SwapCached: 0 kB' 'Active: 1463612 kB' 'Inactive: 70396 kB' 'Active(anon): 1230892 kB' 'Inactive(anon): 0 kB' 'Active(file): 232720 kB' 'Inactive(file): 70396 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1285376 kB' 'Mapped: 75768 kB' 'AnonPages: 248668 kB' 'Shmem: 982260 kB' 'KernelStack: 7320 kB' 'PageTables: 3832 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'KReclaimable: 79640 kB' 'Slab: 196020 kB' 'SReclaimable: 79640 kB' 'SUnreclaim: 116380 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.638 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
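The trace below finishes the node 1 scan, confirms node0=512 and node1=512 as expected, closes even_2G_alloc after about 3.3 s, and starts odd_alloc, which requests 2098176 kB, i.e. 1025 pages of 2048 kB, and spreads the odd count as 513 pages on node 0 and 512 on node 1. A hedged sketch of that size-to-pages split; the numbers mirror the trace but the variable names are illustrative, not hugepages.sh's get_test_nr_hugepages API.

size_kb=2098176
hugepage_kb=2048                                   # Hugepagesize from the meminfo dump above
nr_hugepages=$(( size_kb / hugepage_kb ))          # 1025 -> deliberately odd
no_nodes=2
declare -a nodes_test=()
for (( node = no_nodes - 1; node >= 0; node-- )); do
    nodes_test[node]=$(( nr_hugepages / no_nodes ))     # even share for this node
    nr_hugepages=$(( nr_hugepages - nodes_test[node] ))
    no_nodes=$(( no_nodes - 1 ))                        # remainder drifts to node 0
done
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"    # node0=513 node1=512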
00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:39.639 node0=512 expecting 512 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:05:39.639 node1=512 expecting 512 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:05:39.639 00:05:39.639 real 0m3.335s 00:05:39.639 user 0m1.151s 00:05:39.639 sys 0m2.204s 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:39.639 22:12:49 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:39.639 ************************************ 00:05:39.639 END TEST even_2G_alloc 00:05:39.639 ************************************ 00:05:39.639 22:12:49 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:39.639 22:12:49 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:05:39.639 22:12:49 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:39.639 22:12:49 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:39.639 22:12:49 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:39.639 ************************************ 00:05:39.639 START TEST odd_alloc 
00:05:39.639 ************************************ 00:05:39.639 22:12:49 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:05:39.639 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:05:39.639 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:05:39.639 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:39.898 22:12:49 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:43.242 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:43.242 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:43.242 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:43.242 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:00:04.3 (8086 2021): Already 
using the vfio-pci driver 00:05:43.242 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:43.242 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79155856 kB' 'MemAvailable: 82427808 kB' 'Buffers: 11136 kB' 'Cached: 9279876 kB' 'SwapCached: 0 kB' 'Active: 6336384 kB' 'Inactive: 3442188 kB' 'Active(anon): 5945768 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490420 kB' 'Mapped: 155760 kB' 'Shmem: 5458208 kB' 'KReclaimable: 185584 kB' 'Slab: 490820 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305236 kB' 'KernelStack: 16096 kB' 'PageTables: 7548 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7327520 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200824 kB' 'VmallocChunk: 
0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.242 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.243 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.244 22:12:53 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:43.244 22:12:53 
setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79155648 kB' 'MemAvailable: 82427600 kB' 'Buffers: 11136 kB' 'Cached: 9279876 kB' 'SwapCached: 0 kB' 'Active: 6336228 kB' 'Inactive: 3442188 kB' 'Active(anon): 5945612 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490736 kB' 'Mapped: 155656 kB' 'Shmem: 5458208 kB' 'KReclaimable: 185584 kB' 'Slab: 490780 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305196 kB' 'KernelStack: 16032 kB' 'PageTables: 7352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7330152 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200776 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.244 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
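[editor's annotation, not part of the captured trace] Earlier in this trace the odd_alloc case requests HUGEMEM=2049 (get_test_nr_hugepages 2098176 kB), which rounds to 1025 2 MiB pages, and the per-node assignment ends up as 513 pages on one node and 512 on the other. A rough sketch of that even-split-with-remainder arithmetic (illustrative only, not the exact hugepages.sh code):

    nr_hugepages=1025 no_nodes=2
    per_node=$(( nr_hugepages / no_nodes ))     # 512 pages each
    remainder=$(( nr_hugepages % no_nodes ))    # 1 page left over
    node0=$(( per_node + remainder ))           # 513
    node1=$per_node                             # 512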
00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.245 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 
22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79155416 kB' 'MemAvailable: 82427368 kB' 'Buffers: 11136 kB' 'Cached: 9279896 kB' 'SwapCached: 0 kB' 'Active: 6336224 kB' 'Inactive: 3442188 kB' 'Active(anon): 5945608 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490644 kB' 'Mapped: 155656 kB' 'Shmem: 5458228 kB' 'KReclaimable: 185584 kB' 'Slab: 490780 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305196 kB' 'KernelStack: 16080 kB' 'PageTables: 7648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7330172 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 
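[editor's annotation, not part of the captured trace] Each of the three /proc/meminfo dumps printed in this odd_alloc trace reports HugePages_Total: 1025, HugePages_Free: 1025, HugePages_Rsvd: 0, and HugePages_Surp: 0. When reading such a log, or checking a live node, the same counters can be pulled directly with a one-liner (an illustrative shortcut, not part of the SPDK scripts):

    grep -E '^HugePages_(Total|Free|Rsvd|Surp):' /proc/meminfo
    # HugePages_Total:    1025
    # HugePages_Free:     1025
    # HugePages_Rsvd:        0
    # HugePages_Surp:        0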
00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.246 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.247 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.249 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:05:43.250 nr_hugepages=1025 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:43.250 resv_hugepages=0 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:43.250 surplus_hugepages=0 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:43.250 anon_hugepages=0 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@18 -- # local node= 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79155704 kB' 'MemAvailable: 82427656 kB' 'Buffers: 11136 kB' 'Cached: 9279916 kB' 'SwapCached: 0 kB' 'Active: 6336952 kB' 'Inactive: 3442188 kB' 'Active(anon): 5946336 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491344 kB' 'Mapped: 155656 kB' 'Shmem: 5458248 kB' 'KReclaimable: 185584 kB' 'Slab: 490780 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305196 kB' 'KernelStack: 16288 kB' 'PageTables: 7720 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7330032 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.250 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
[[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.251 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 
22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.252 22:12:53 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:43.252 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 38098584 kB' 'MemUsed: 10018356 kB' 'SwapCached: 0 kB' 'Active: 4871484 kB' 'Inactive: 3371792 kB' 'Active(anon): 4713588 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371792 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8005652 kB' 'Mapped: 79876 kB' 'AnonPages: 240832 kB' 'Shmem: 4475964 kB' 'KernelStack: 8776 kB' 'PageTables: 3808 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105944 kB' 'Slab: 
294856 kB' 'SReclaimable: 105944 kB' 'SUnreclaim: 188912 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.253 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41056488 kB' 'MemUsed: 3120044 kB' 'SwapCached: 0 kB' 'Active: 1464940 kB' 'Inactive: 70396 kB' 'Active(anon): 1232220 kB' 'Inactive(anon): 0 kB' 'Active(file): 232720 kB' 'Inactive(file): 70396 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1285424 kB' 'Mapped: 75780 kB' 'AnonPages: 249968 kB' 'Shmem: 982308 kB' 'KernelStack: 7336 kB' 'PageTables: 3576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79640 kB' 'Slab: 195924 kB' 'SReclaimable: 79640 kB' 'SUnreclaim: 116284 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.254 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.255 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:43.256 22:12:53 
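The scan traced above is setup/common.sh walking the captured meminfo lines one field at a time: each line is split on IFS=': ', every key that is not the requested one (HugePages_Surp here) is skipped with continue, and the matching value is echoed before the helper returns 0. A minimal standalone sketch of that parsing pattern follows; the function name get_meminfo_value and its exact shape are this sketch's own, not the SPDK helper, which (as the trace shows) also mapfiles the file and handles the per-node meminfo files under /sys/devices/system/node/.

#!/usr/bin/env bash
# Illustrative sketch of the /proc/meminfo scan seen in the trace above:
# split each line on ': ', skip every key except the requested one,
# print its value. Helper name and structure are hypothetical.
get_meminfo_value() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue    # not the key we want -> next line
        echo "$val"                         # e.g. "0" for HugePages_Surp
        return 0
    done < /proc/meminfo
    return 1                                # key not present
}

get_meminfo_value HugePages_Surp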
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:05:43.256 node0=512 expecting 513 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:05:43.256 node1=513 expecting 512 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:43.256 00:05:43.256 real 0m3.525s 00:05:43.256 user 0m1.296s 00:05:43.256 sys 0m2.233s 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:43.256 22:12:53 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:43.256 ************************************ 00:05:43.256 END TEST odd_alloc 00:05:43.256 ************************************ 00:05:43.256 22:12:53 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:43.256 22:12:53 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:05:43.256 22:12:53 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:43.256 22:12:53 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:43.256 22:12:53 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:43.256 ************************************ 00:05:43.256 START TEST custom_alloc 00:05:43.256 ************************************ 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # 
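The odd_alloc summary just above can look alarming at first glance (node0=512 expecting 513, node1=513 expecting 512), yet the final check [[ 512 513 == 512 513 ]] passes and the test ends successfully. The verification only requires the set of per-node counts to match, since the kernel may place the odd extra page on either NUMA node; the trace at setup/hugepages.sh@126-130 achieves this by using the counts themselves as array indices, so the key expansion comes out in ascending order. A small self-contained sketch of that trick, with hypothetical per-node values standing in for the real ones:

#!/usr/bin/env bash
# Order-insensitive comparison sketch: using the per-node page counts as array
# indices means "${!arr[*]}" expands to those counts in ascending order, so the
# expected and actual distributions compare as plain strings regardless of
# which node ended up with the extra page. Example values are hypothetical.
declare -a nodes_test=([0]=513 [1]=512)   # what the test asked for per node
declare -a nodes_sys=([0]=512 [1]=513)    # what the kernel actually gave out
declare -a sorted_t sorted_s

for node in "${!nodes_test[@]}"; do
    sorted_t[nodes_test[node]]=1
    sorted_s[nodes_sys[node]]=1
    echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
done

# Both key lists expand to "512 513": the distributions match as sets.
[[ "${!sorted_s[*]}" == "${!sorted_t[*]}" ]] && echo 'per-node totals match'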
user_nodes=() 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:43.256 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:43.257 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:05:43.257 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:43.257 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:05:43.257 22:12:53 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@78 -- # return 0 00:05:43.257 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:05:43.257 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:05:43.257 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:43.516 22:12:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:46.810 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:46.810 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:46.810 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:46.810 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:00:04.1 (8086 
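For custom_alloc the trace builds an uneven request, nodes_hp[0]=512 and nodes_hp[1]=1024, joins it (with the IFS=, set at hugepages.sh@167) into HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024', and then re-runs spdk/scripts/setup.sh, which apparently performs the actual reservation. As a rough illustration only (not the setup.sh code path), such a per-node request corresponds to the per-node sysfs knobs sketched below; note 512 + 1024 = 1536 pages of 2048 kB, i.e. 3145728 kB, which is exactly the HugePages_Total / Hugetlb pair reported in the meminfo snapshots that follow.

#!/usr/bin/env bash
# Hypothetical illustration of the per-node reservation implied by
# HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' above. Requires root and
# really does reserve hugepage memory if run.
declare -a want=([0]=512 [1]=1024)

for node in "${!want[@]}"; do
    sysfs=/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages
    echo "${want[node]}" | sudo tee "$sysfs" >/dev/null
done

# Expect: HugePages_Total: 1536 and Hugetlb: 3145728 kB (1536 x 2048 kB).
grep -E 'HugePages_Total|Hugetlb' /proc/meminfo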
2021): Already using the vfio-pci driver 00:05:46.810 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:46.810 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78088092 kB' 'MemAvailable: 81360044 kB' 'Buffers: 11136 kB' 'Cached: 9280020 kB' 'SwapCached: 0 kB' 'Active: 6336948 kB' 'Inactive: 3442188 kB' 'Active(anon): 5946332 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491216 kB' 'Mapped: 155728 kB' 'Shmem: 5458352 kB' 'KReclaimable: 185584 kB' 'Slab: 490904 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305320 kB' 'KernelStack: 16160 kB' 
'PageTables: 7744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7328180 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.075 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.076 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:47.077 
22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78087956 kB' 'MemAvailable: 81359908 kB' 'Buffers: 11136 kB' 'Cached: 9280020 kB' 'SwapCached: 0 kB' 'Active: 6337036 kB' 'Inactive: 3442188 kB' 'Active(anon): 5946420 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491316 kB' 'Mapped: 155672 kB' 'Shmem: 5458352 kB' 'KReclaimable: 185584 kB' 'Slab: 490904 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305320 kB' 'KernelStack: 16128 kB' 'PageTables: 7632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7329324 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200840 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 
22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.077 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.078 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78088664 kB' 
'MemAvailable: 81360616 kB' 'Buffers: 11136 kB' 'Cached: 9280040 kB' 'SwapCached: 0 kB' 'Active: 6336696 kB' 'Inactive: 3442188 kB' 'Active(anon): 5946080 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490960 kB' 'Mapped: 155672 kB' 'Shmem: 5458372 kB' 'KReclaimable: 185584 kB' 'Slab: 491040 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305456 kB' 'KernelStack: 16128 kB' 'PageTables: 7300 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7330588 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200888 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.079 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:05:47.080 nr_hugepages=1536 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:47.080 resv_hugepages=0 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:47.080 surplus_hugepages=0 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:47.080 anon_hugepages=0 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.080 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78088664 kB' 'MemAvailable: 81360616 kB' 'Buffers: 11136 kB' 'Cached: 9280040 kB' 'SwapCached: 0 kB' 'Active: 6336960 kB' 'Inactive: 3442188 kB' 'Active(anon): 5946344 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491252 kB' 'Mapped: 155672 kB' 'Shmem: 5458372 kB' 'KReclaimable: 185584 kB' 'Slab: 491040 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305456 kB' 'KernelStack: 16144 kB' 
'PageTables: 7528 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7329368 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.081 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:47.082 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 38078828 kB' 'MemUsed: 10038112 kB' 'SwapCached: 0 kB' 'Active: 4873028 kB' 'Inactive: 3371792 kB' 'Active(anon): 4715132 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371792 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8005728 kB' 'Mapped: 79876 kB' 'AnonPages: 242236 kB' 'Shmem: 4476040 kB' 'KernelStack: 8936 kB' 'PageTables: 4136 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105944 kB' 'Slab: 294984 kB' 'SReclaimable: 105944 kB' 'SUnreclaim: 189040 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 
22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.082 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- 
# [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.083 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.083 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 40008388 kB' 'MemUsed: 4168144 kB' 'SwapCached: 0 kB' 'Active: 1464188 kB' 'Inactive: 70396 kB' 'Active(anon): 1231468 kB' 'Inactive(anon): 0 kB' 'Active(file): 232720 kB' 'Inactive(file): 70396 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1285512 kB' 'Mapped: 75796 kB' 'AnonPages: 249120 kB' 'Shmem: 982396 kB' 'KernelStack: 7288 kB' 'PageTables: 3620 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 79640 kB' 'Slab: 196024 kB' 'SReclaimable: 79640 kB' 'SUnreclaim: 116384 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
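The xtrace above is setup/common.sh's get_meminfo walking /sys/devices/system/node/node1/meminfo key by key: every field that is not the requested HugePages_Surp hits the `continue` at common.sh@32, and the matching key at the end of the loop echoes its value. A minimal sketch of that parsing pattern follows; the helper name and exact flow are approximations inferred from this trace, not the verbatim setup/common.sh:

  get_meminfo_sketch() {                       # e.g. get_meminfo_sketch HugePages_Surp 1
    local get=$1 node=$2 var val _ line
    local mem_f=/proc/meminfo
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
      mem_f=/sys/devices/system/node/node$node/meminfo
    local -a mem
    mapfile -t mem < "$mem_f"
    shopt -s extglob
    mem=("${mem[@]#Node +([0-9]) }")           # per-node files prefix every line with "Node N "
    for line in "${mem[@]}"; do
      IFS=': ' read -r var val _ <<< "$line"
      [[ $var == "$get" ]] || continue         # each skipped key shows up as "continue" in the xtrace
      echo "$val" && return 0                  # kB value, or a bare count for HugePages_* keys
    done
  }

  get_meminfo_sketch HugePages_Surp 1          # -> 0 on this node, matching the "echo 0" that ends the loop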
00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.084 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:47.085 22:12:57 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:47.085 node0=512 expecting 512 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:05:47.085 node1=1024 expecting 1024 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:05:47.085 00:05:47.085 real 0m3.836s 00:05:47.085 user 0m1.386s 00:05:47.085 sys 0m2.546s 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:47.085 22:12:57 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:47.085 ************************************ 00:05:47.085 END TEST custom_alloc 00:05:47.085 ************************************ 00:05:47.345 22:12:57 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:47.345 22:12:57 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:05:47.345 22:12:57 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:47.345 22:12:57 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:47.345 22:12:57 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:47.345 ************************************ 00:05:47.345 START TEST no_shrink_alloc 00:05:47.345 ************************************ 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 
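At this point custom_alloc has passed (node0=512 and node1=1024, as expected) and no_shrink_alloc begins by turning the requested size into a hugepage count pinned to node 0: 2097152 kB (2 GiB) at the default 2048 kB page size is 1024 pages, which is why the trace sets nr_hugepages=1024 above and will set nodes_test[0]=1024 just below. A rough sketch of that accounting, with variable names approximated from the hugepages.sh trace rather than copied from the script:

  size_kb=2097152 default_hugepage_kb=2048
  nr_hugepages=$(( size_kb / default_hugepage_kb ))   # 2097152 / 2048 = 1024
  user_nodes=(0)                                      # node_ids=('0') above: everything goes to node 0
  declare -A nodes_test
  for node in "${user_nodes[@]}"; do
    nodes_test[$node]=$nr_hugepages                   # nodes_test[0]=1024, nothing requested on node 1
  done
  echo "node0=${nodes_test[0]:-0} expecting 1024"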
00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:47.345 22:12:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:50.641 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:50.641 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:50.641 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:50.641 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:50.641 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:50.641 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:05:50.641 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:50.641 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:50.641 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:50.641 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:50.641 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79119892 kB' 'MemAvailable: 82391844 kB' 'Buffers: 11136 kB' 'Cached: 9280168 kB' 'SwapCached: 0 kB' 'Active: 6338300 kB' 'Inactive: 3442188 kB' 'Active(anon): 5947684 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492064 kB' 'Mapped: 155820 kB' 'Shmem: 5458500 kB' 'KReclaimable: 185584 kB' 'Slab: 491320 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305736 kB' 'KernelStack: 16112 kB' 'PageTables: 7612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7328744 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200888 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
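The scan above is verify_nr_hugepages fetching AnonHugePages from /proc/meminfo (node is unset here, so the system-wide file is used rather than a per-node one); the `[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]` guard at hugepages.sh@96 only triggers it because transparent hugepages are not set to "never" on this host. A small sketch of that check, assuming the usual sysfs path for the value shown in the guard and approximating the surrounding script:

  thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # "always [madvise] never" on this host
  anon=0
  if [[ $thp != *"[never]"* ]]; then
    # same key-by-key scan as get_meminfo, this time for AnonHugePages
    while IFS=': ' read -r var val _; do
      [[ $var == AnonHugePages ]] && { anon=$val; break; }
    done < /proc/meminfo
  fi
  echo "anon=$anon"                                      # 0 kB here, matching "anon=0" at hugepages.sh@97 below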
00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.642 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.643 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79120740 kB' 'MemAvailable: 82392692 kB' 'Buffers: 11136 kB' 'Cached: 9280172 kB' 'SwapCached: 0 kB' 'Active: 6337360 kB' 'Inactive: 3442188 kB' 'Active(anon): 5946744 kB' 
'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491604 kB' 'Mapped: 155696 kB' 'Shmem: 5458504 kB' 'KReclaimable: 185584 kB' 'Slab: 491256 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305672 kB' 'KernelStack: 16096 kB' 'PageTables: 7532 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7328764 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB'
[xtrace condensed: setup/common.sh@31-32 (get_meminfo, looking up HugePages_Surp) reads each /proc/meminfo field with IFS=': ' / read -r var val _ and hits continue for every key from MemTotal through Unaccepted; the trace resumes below at the HugePages_Total comparison and ends with the HugePages_Surp match.]
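The scan condensed above is the body of get_meminfo in setup/common.sh as reflected by the xtrace statements. A minimal sketch of that helper, reconstructed from the trace rather than copied from the SPDK sources (the loop structure, error handling, and return-value conventions are assumptions), looks like this:

    # Sketch only: reproduces the field scan seen in the trace, not the verbatim script.
    shopt -s extglob   # needed for the +([0-9]) pattern used on the Node prefix

    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _
        local mem_f=/proc/meminfo mem
        # Per-node lookups switch to the node-specific meminfo (path as in the trace).
        if [[ -e /sys/devices/system/node/node$node/meminfo ]] && [[ -n $node ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")          # strip "Node <n> " prefixes
        local line
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue      # the repeated "continue" lines in the log
            echo "$val"                           # print the value of the requested key
            return 0
        done
        return 1
    }

On this runner it would print 92293472 for MemTotal and 0 for HugePages_Surp, matching the snapshot printed above and the surp=0 line that follows below.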
00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.645 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79119992 kB' 'MemAvailable: 82391944 kB' 'Buffers: 11136 kB' 'Cached: 9280188 kB' 'SwapCached: 0 kB' 'Active: 6338000 kB' 'Inactive: 3442188 kB' 'Active(anon): 5947384 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492236 kB' 'Mapped: 155696 kB' 'Shmem: 5458520 kB' 
'KReclaimable: 185584 kB' 'Slab: 491264 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305680 kB' 'KernelStack: 16112 kB' 'PageTables: 7620 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7362028 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB'
[xtrace condensed: the same setup/common.sh@31-32 field scan repeats for HugePages_Rsvd, skipping every key from MemTotal through HugePages_Total; the trace resumes below at HugePages_Free and ends with the HugePages_Rsvd match and the setup/hugepages.sh bookkeeping that follows.]
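For reference, and assuming the get_meminfo sketch given earlier (a standalone invocation like this is not part of the test flow), the three lookups this stage performs resolve against the snapshot values printed above as:

    get_meminfo HugePages_Surp    # -> 0     (drives the "surp=0" line earlier)
    get_meminfo HugePages_Rsvd    # -> 0     (drives the "resv=0" line below)
    get_meminfo HugePages_Total   # -> 1024  (the hugepage pool being verified)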
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:50.908 nr_hugepages=1024 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:50.908 resv_hugepages=0 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:50.908 surplus_hugepages=0 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:50.908 anon_hugepages=0 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.908 22:13:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79120248 kB' 'MemAvailable: 82392200 kB' 'Buffers: 11136 kB' 'Cached: 9280212 kB' 'SwapCached: 0 kB' 'Active: 6337332 kB' 'Inactive: 3442188 kB' 'Active(anon): 5946716 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 
kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491492 kB' 'Mapped: 155696 kB' 'Shmem: 5458544 kB' 'KReclaimable: 185584 kB' 'Slab: 491264 kB' 'SReclaimable: 185584 kB' 'SUnreclaim: 305680 kB' 'KernelStack: 16080 kB' 'PageTables: 7488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7328444 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200840 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB'
[xtrace condensed: the setup/common.sh@31-32 field scan repeats a third time for HugePages_Total, skipping every key from MemTotal through Unaccepted; the excerpt is truncated here, before the HugePages_Total comparison completes.]
-- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37013612 kB' 'MemUsed: 11103328 kB' 'SwapCached: 0 kB' 'Active: 4871396 kB' 'Inactive: 3371792 kB' 'Active(anon): 4713500 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371792 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8005740 kB' 'Mapped: 79880 kB' 'AnonPages: 240640 kB' 'Shmem: 4476052 kB' 'KernelStack: 8760 kB' 'PageTables: 3720 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105944 kB' 'Slab: 295268 kB' 'SReclaimable: 105944 kB' 'SUnreclaim: 189324 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.909 
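The trace above is setup/common.sh's get_meminfo resolving HugePages_Surp for node 0: it points mem_f at /sys/devices/system/node/node0/meminfo, mapfiles it, strips the leading "Node 0" prefix from each line, then splits every line on ': ' and keeps reading until the requested field matches, echoing its value. A minimal sketch of that lookup, assuming a hypothetical helper name meminfo_value (not the script's own function):

meminfo_value() {
    # Sketch only: return one meminfo field, optionally scoped to a single NUMA node.
    local key=$1 node=${2:-}
    local file=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        file=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    while IFS=': ' read -r var val _; do
        # Every non-matching field corresponds to one of the 'continue' lines in the trace.
        [[ $var == "$key" ]] && { echo "$val"; return 0; }
    done < <(sed 's/^Node [0-9]* //' "$file")   # per-node files prefix each line with "Node N"
    return 1
}

# e.g. meminfo_value HugePages_Surp 0 would print 0 here, matching the value the trace echoes.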
22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] [trace condensed: the node0 lookup continues past Active(file) and every remaining field through FilePmdMapped; none match HugePages_Surp] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p
]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:50.909 node0=1024 expecting 1024 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:50.909 22:13:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:54.224 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:54.224 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:54.224 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:54.224 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:00:04.0 
(8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:54.224 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:54.224 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.224 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79129708 kB' 'MemAvailable: 82401656 kB' 'Buffers: 11136 kB' 'Cached: 9280300 kB' 'SwapCached: 0 kB' 'Active: 6339756 kB' 'Inactive: 3442188 kB' 'Active(anon): 5949140 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 493796 kB' 'Mapped: 155724 kB' 'Shmem: 5458632 kB' 'KReclaimable: 185576 kB' 'Slab: 490504 kB' 'SReclaimable: 185576 kB' 'SUnreclaim: 304928 kB' 'KernelStack: 16432 kB' 'PageTables: 8456 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7331876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201256 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.225 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- 
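This second verify_nr_hugepages pass first checks the transparent-hugepage setting (the "always [madvise] never" string above, so THP is not disabled) and reads AnonHugePages, which is 0 here, before re-checking the hugepage pool that setup.sh left in place ("Requested 512 hugepages but 1024 already allocated on node0", since CLEAR_HUGE=no). The per-node accounting that earlier printed "node0=1024 expecting 1024" amounts to comparing each node's HugePages_Total against the count expected there; a rough sketch under that assumption (hypothetical helper, simpler than the hugepages.sh arithmetic, which also folds reserved and surplus pages into the expectation):

verify_node_hugepages() {
    # Sketch only: compare one node's allocated hugepages against the count expected there.
    local node=$1 expected=$2
    local meminfo=/sys/devices/system/node/node$node/meminfo
    local total surp
    total=$(awk '/HugePages_Total/ {print $NF}' "$meminfo")
    surp=$(awk '/HugePages_Surp/ {print $NF}' "$meminfo")
    echo "node${node}=${total} expecting ${expected}"
    # Surplus pages are transient overflow, so they are not counted toward the persistent pool.
    (( total - surp == expected ))
}

# e.g. verify_node_hugepages 0 1024 prints "node0=1024 expecting 1024" and succeeds on this host.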
setup/hugepages.sh@97 -- # anon=0 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79130484 kB' 'MemAvailable: 82402432 kB' 'Buffers: 11136 kB' 'Cached: 9280300 kB' 'SwapCached: 0 kB' 'Active: 6339428 kB' 'Inactive: 3442188 kB' 'Active(anon): 5948812 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 493456 kB' 'Mapped: 155724 kB' 'Shmem: 5458632 kB' 'KReclaimable: 185576 kB' 'Slab: 490504 kB' 'SReclaimable: 185576 kB' 'SUnreclaim: 304928 kB' 'KernelStack: 16640 kB' 'PageTables: 8504 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7331892 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.226 22:13:04 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] [trace condensed: the final system-wide HugePages_Surp lookup continues past MemAvailable and every remaining field through FileHugePages; none match HugePages_Surp] 00:05:54.227 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped ==
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.227 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.227 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.227 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.227 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.227 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.227 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- 
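The trace above is setup/common.sh's get_meminfo helper scanning /proc/meminfo field by field until it reaches the requested counter (here HugePages_Surp, which reads 0, so hugepages.sh records surp=0). A minimal sketch of a lookup in that spirit, written for illustration only (get_meminfo_sketch is an assumed stand-in, not SPDK's actual helper, whose details may differ):

    # Hypothetical stand-in for the traced get_meminfo; prints the value of one
    # /proc/meminfo field, or of a per-NUMA-node counter when a node is given.
    get_meminfo_sketch() {
        local get=$1 node=$2
        local mem_f=/proc/meminfo
        # Per-node counters live in sysfs; otherwise use the global file.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local var val _
        # Per-node files prefix every line with "Node <n> "; strip that, then
        # split each line on ": " into the field name and its value.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < <(sed 's/^Node [0-9]* //' "$mem_f")
        return 1
    }

Against the snapshot printed just below, get_meminfo_sketch HugePages_Surp would print 0 and get_meminfo_sketch HugePages_Total would print 1024; get_meminfo_sketch HugePages_Surp 0 would read node 0's counters instead.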
00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:54.228 22:13:04
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.228 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.228 22:13:04
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79131544 kB' 'MemAvailable: 82403492 kB' 'Buffers: 11136 kB' 'Cached: 9280300 kB' 'SwapCached: 0 kB' 'Active: 6338676 kB' 'Inactive: 3442188 kB' 'Active(anon): 5948060 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492628 kB' 'Mapped: 155716 kB' 'Shmem: 5458632 kB' 'KReclaimable: 185576 kB' 'Slab: 490588 kB' 'SReclaimable: 185576 kB' 'SUnreclaim: 305012 kB' 'KernelStack: 16304 kB' 'PageTables: 7856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7330440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:54.228 22:13:04
[... the common.sh@31/@32 read-compare-continue loop skips MemTotal through HugePages_Free; none of them match HugePages_Rsvd ...]
00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:54.230 22:13:04
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:54.230 nr_hugepages=1024 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:54.230 resv_hugepages=0 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:54.230 surplus_hugepages=0 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:54.230 anon_hugepages=0 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
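At this point the script has all the counters it needs: nr_hugepages=1024 pages were requested, and the kernel reports 0 reserved and 0 surplus pages, so the (( 1024 == nr_hugepages + surp + resv )) and (( 1024 == nr_hugepages )) checks pass. A hypothetical condensation of that bookkeeping, reusing the get_meminfo_sketch helper sketched above (again an illustrative sketch under assumed names, not SPDK's hugepages.sh itself):

    verify_hugepage_pool() {
        local expected=$1 surp resv total node
        surp=$(get_meminfo_sketch HugePages_Surp)      # 0 in this run
        resv=$(get_meminfo_sketch HugePages_Rsvd)      # 0 in this run
        total=$(get_meminfo_sketch HugePages_Total)    # 1024 in this run
        echo "nr_hugepages=$expected resv_hugepages=$resv surplus_hugepages=$surp"
        # The pool must not have shrunk: the reported total still has to cover
        # the requested pages plus any surplus and reserved pages.
        (( total == expected + surp + resv )) || return 1
        # The accounting is then repeated per NUMA node, reading each node's
        # /sys/devices/system/node/node<N>/meminfo instead of /proc/meminfo.
        for node in /sys/devices/system/node/node[0-9]*; do
            echo "node${node##*node} HugePages_Surp: $(get_meminfo_sketch HugePages_Surp "${node##*node}")"
        done
    }

In the trace that follows, the global HugePages_Total lookup returns 1024, get_nodes finds two NUMA nodes, expects all 1024 pages on node 0 and none on node 1, and then re-runs get_meminfo against node 0's sysfs meminfo.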
00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:54.230 22:13:04
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.230 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.230 22:13:04
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79131636 kB' 'MemAvailable: 82403584 kB' 'Buffers: 11136 kB' 'Cached: 9280304 kB' 'SwapCached: 0 kB' 'Active: 6338672 kB' 'Inactive: 3442188 kB' 'Active(anon): 5948056 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3442188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492612 kB' 'Mapped: 155716 kB' 'Shmem: 5458636 kB' 'KReclaimable: 185576 kB' 'Slab: 490588 kB' 'SReclaimable: 185576 kB' 'SUnreclaim: 305012 kB' 'KernelStack: 16272 kB' 'PageTables: 7628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7331936 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 48320 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 728484 kB' 'DirectMap2M: 13627392 kB' 'DirectMap1G: 87031808 kB' 00:05:54.230 22:13:04
[... the common.sh@31/@32 read-compare-continue loop skips MemTotal through Unaccepted; none of them match HugePages_Total ...]
00:05:54.231 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:54.231 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:54.231 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:54.231 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:54.232 22:13:04
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:54.232 22:13:04
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc --
setup/common.sh@20 -- # local mem_f mem 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37031124 kB' 'MemUsed: 11085816 kB' 'SwapCached: 0 kB' 'Active: 4871508 kB' 'Inactive: 3371792 kB' 'Active(anon): 4713612 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371792 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8005740 kB' 'Mapped: 79892 kB' 'AnonPages: 240668 kB' 'Shmem: 4476052 kB' 'KernelStack: 8920 kB' 'PageTables: 3668 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105944 kB' 'Slab: 294500 kB' 'SReclaimable: 105944 kB' 'SUnreclaim: 188556 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 
22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.232 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:54.233 node0=1024 expecting 1024 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:54.233 00:05:54.233 real 0m6.698s 00:05:54.233 user 0m2.423s 00:05:54.233 sys 0m4.334s 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:54.233 22:13:04 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:54.233 ************************************ 00:05:54.233 END TEST no_shrink_alloc 00:05:54.233 ************************************ 00:05:54.233 22:13:04 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:54.233 22:13:04 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:54.233 00:05:54.233 real 0m28.286s 00:05:54.233 user 0m9.637s 00:05:54.233 sys 0m16.790s 00:05:54.233 22:13:04 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:54.233 22:13:04 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:54.233 ************************************ 00:05:54.233 END TEST hugepages 00:05:54.233 ************************************ 00:05:54.233 22:13:04 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:54.233 22:13:04 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:54.233 22:13:04 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:54.233 22:13:04 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:54.233 22:13:04 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:54.233 ************************************ 00:05:54.233 START TEST driver 00:05:54.233 ************************************ 00:05:54.233 22:13:04 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:54.233 * Looking for test storage... 
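The driver suite that starts here (TEST driver, then guess_driver) picks vfio-pci over a uio fallback by checking whether the kernel exposes any /sys/kernel/iommu_groups entries (or permits unsafe no-IOMMU mode) and whether the vfio_pci module resolves via modprobe --show-depends, as the trace that follows shows. A condensed sketch of that decision, not the actual setup/driver.sh code, with uio_pci_generic standing in for whatever non-IOMMU fallback the real script would take:

#!/usr/bin/env bash
# Condensed sketch of the vfio-pci vs. uio decision exercised by guess_driver
# (spdk/test/setup/driver.sh); simplified, not the real implementation.
pick_driver() {
    shopt -s nullglob
    local groups=(/sys/kernel/iommu_groups/*)
    local unsafe=N
    if [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]; then
        unsafe=$(cat /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
    fi
    # vfio-pci needs working IOMMU groups (or unsafe no-IOMMU mode) plus a
    # resolvable vfio_pci module; otherwise fall back to a uio driver.
    if { ((${#groups[@]} > 0)) || [[ $unsafe == [Yy] ]]; } &&
        modprobe --show-depends vfio_pci &> /dev/null; then
        echo vfio-pci
    else
        echo uio_pci_generic    # placeholder fallback, not taken in this run
    fi
}
pick_driver

In this run the node reports 216 IOMMU groups and unsafe_vfio=N, so the condition resolves to vfio-pci, which is what the trace echoes as "Looking for driver=vfio-pci".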
00:05:54.233 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:54.233 22:13:04 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:54.233 22:13:04 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:54.233 22:13:04 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:59.509 22:13:08 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:59.509 22:13:08 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:59.509 22:13:08 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:59.509 22:13:08 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:59.509 ************************************ 00:05:59.509 START TEST guess_driver 00:05:59.509 ************************************ 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 216 > 0 )) 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:59.509 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:59.509 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:59.509 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:59.509 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:59.509 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:59.509 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:59.509 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- 
setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:59.509 Looking for driver=vfio-pci 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:59.509 22:13:08 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:02.041 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:06:02.041 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:06:02.041 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.041 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:06:02.041 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:06:02.041 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.041 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.041 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.041 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.299 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.300 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:02.559 22:13:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:05.090 22:13:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:05.090 22:13:15 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:05.090 22:13:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:05.090 22:13:15 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:06:05.090 22:13:15 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:06:05.090 22:13:15 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:05.090 22:13:15 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:10.390 00:06:10.390 real 
0m11.248s 00:06:10.390 user 0m2.914s 00:06:10.390 sys 0m5.265s 00:06:10.390 22:13:20 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:10.390 22:13:20 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:06:10.390 ************************************ 00:06:10.390 END TEST guess_driver 00:06:10.390 ************************************ 00:06:10.390 22:13:20 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:06:10.390 00:06:10.390 real 0m15.919s 00:06:10.390 user 0m4.133s 00:06:10.390 sys 0m7.832s 00:06:10.390 22:13:20 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:10.390 22:13:20 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:06:10.390 ************************************ 00:06:10.390 END TEST driver 00:06:10.390 ************************************ 00:06:10.390 22:13:20 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:06:10.390 22:13:20 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:06:10.390 22:13:20 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:10.390 22:13:20 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:10.390 22:13:20 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:10.390 ************************************ 00:06:10.390 START TEST devices 00:06:10.390 ************************************ 00:06:10.390 22:13:20 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:06:10.390 * Looking for test storage... 00:06:10.390 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:06:10.390 22:13:20 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:06:10.390 22:13:20 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:06:10.390 22:13:20 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:10.390 22:13:20 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:06:14.581 22:13:24 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:06:14.581 22:13:24 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:06:14.581 22:13:24 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:06:14.581 22:13:24 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:06:14.581 22:13:24 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:06:14.581 22:13:24 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:06:14.581 22:13:24 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:06:14.581 22:13:24 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@200 -- 
# for block in "/sys/block/nvme"!(*c*) 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:06:14.581 22:13:24 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:06:14.581 22:13:24 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:06:14.581 No valid GPT data, bailing 00:06:14.581 22:13:24 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:06:14.581 22:13:24 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:06:14.581 22:13:24 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:06:14.581 22:13:24 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:06:14.581 22:13:24 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:06:14.581 22:13:24 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:06:14.581 22:13:24 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:06:14.581 22:13:24 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:14.581 22:13:24 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.581 22:13:24 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:14.581 ************************************ 00:06:14.581 START TEST nvme_mount 00:06:14.581 ************************************ 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- 
setup/common.sh@44 -- # parts=() 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:14.581 22:13:24 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:06:15.150 Creating new GPT entries in memory. 00:06:15.150 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:15.150 other utilities. 00:06:15.150 22:13:25 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:15.150 22:13:25 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:15.150 22:13:25 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:15.150 22:13:25 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:15.150 22:13:25 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:16.525 Creating new GPT entries in memory. 00:06:16.525 The operation has completed successfully. 
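At this point sgdisk has wiped the test disk's partition tables (--zap-all) and created a single partition spanning sectors 2048-2099199 (1 GiB); the test then formats it with mkfs.ext4 -qF and mounts it to create the dummy test_nvme file. The same sequence reduced to a standalone sketch, with DISK and MNT as placeholders and partprobe standing in for the sync_dev_uevents.sh/flock handling the real scripts use (do not run it against a disk that holds data):

#!/usr/bin/env bash
set -euo pipefail
# Sketch of the wipe/partition/format/mount sequence the nvme_mount test runs.
# DISK and MNT are placeholders; partprobe stands in for the uevent syncing
# (sync_dev_uevents.sh) and flock locking used by the real scripts.
DISK=/dev/nvme0n1
MNT=/tmp/nvme_mount_sketch

sgdisk "$DISK" --zap-all                   # destroy existing GPT/MBR structures
sgdisk "$DISK" --new=1:2048:2099199        # one partition: sectors 2048-2099199 (1 GiB)
partprobe "$DISK"                          # ask the kernel to re-read the table
mkfs.ext4 -qF "${DISK}p1"                  # quiet, forced ext4 format
mkdir -p "$MNT"
mount "${DISK}p1" "$MNT"
touch "$MNT/test_nvme"                     # dummy file the test later verifies

The later cleanup_nvme steps in the log undo exactly this: unmount, wipefs the partition, then wipefs the whole disk so both GPT copies and the protective MBR are erased.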
00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 3364971 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:16.525 22:13:26 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ 
_ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:19.811 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.071 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:20.071 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:20.071 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:20.071 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:20.071 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:20.071 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:06:20.071 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:20.071 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:20.071 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:20.071 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:20.071 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:20.071 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:20.071 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:20.330 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:20.330 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:06:20.330 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:20.330 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:06:20.330 
22:13:30 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:20.330 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:20.589 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:06:20.589 22:13:30 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:20.589 22:13:30 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:20.589 22:13:30 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:23.879 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ 
_ status 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:24.138 22:13:34 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.424 22:13:37 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:27.424 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.683 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:27.683 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:27.683 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:06:27.683 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:06:27.683 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:27.683 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:27.683 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:27.683 22:13:37 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:27.683 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:27.683 00:06:27.683 real 0m13.507s 00:06:27.683 user 0m4.112s 00:06:27.683 sys 0m7.294s 00:06:27.683 22:13:37 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:27.683 22:13:37 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:06:27.683 ************************************ 00:06:27.683 END TEST nvme_mount 00:06:27.683 ************************************ 00:06:27.683 22:13:38 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:06:27.683 22:13:38 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:06:27.683 22:13:38 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:27.683 22:13:38 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:27.683 22:13:38 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:27.942 ************************************ 00:06:27.942 START TEST dm_mount 00:06:27.942 ************************************ 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:27.942 22:13:38 
setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:27.942 22:13:38 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:06:28.878 Creating new GPT entries in memory. 00:06:28.878 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:28.878 other utilities. 00:06:28.878 22:13:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:28.879 22:13:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:28.879 22:13:39 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:28.879 22:13:39 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:28.879 22:13:39 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:29.817 Creating new GPT entries in memory. 00:06:29.817 The operation has completed successfully. 00:06:29.817 22:13:40 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:29.817 22:13:40 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:29.817 22:13:40 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:29.817 22:13:40 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:29.817 22:13:40 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:06:31.198 The operation has completed successfully. 
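The partition_drive trace above converts the requested 1073741824-byte partition size into 512-byte sectors (size /= 512 gives 2097152 sectors), wipes the old label, and lays down two GPT partitions back to back. A minimal stand-alone sketch of the same sequence, assuming a 512-byte logical sector size and an otherwise idle /dev/nvme0n1:

    # destroy any existing partition table and GPT structures
    sgdisk /dev/nvme0n1 --zap-all

    # partition 1: sectors 2048..2099199   (2097152 sectors = 1 GiB)
    sgdisk /dev/nvme0n1 --new=1:2048:2099199

    # partition 2: sectors 2099200..4196351 (the next 1 GiB)
    sgdisk /dev/nvme0n1 --new=2:2099200:4196351

The harness itself serializes each sgdisk call with flock and waits for the matching udev block/partition events through scripts/sync_dev_uevents.sh, which is why the "Creating new GPT entries in memory" messages appear interleaved with the next loop iteration.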
00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 3369250 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 
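On top of those two partitions the dm_mount test creates a device-mapper node named nvme_dm_test, resolves it to /dev/dm-0, puts ext4 on it and mounts it under test/setup/dm_mount before re-running the verify loop. The dmsetup table is not echoed in the trace; the sketch below assumes a simple linear concatenation of the two 1 GiB partitions, which is consistent with the holder links the script checks afterwards (/sys/class/block/nvme0n1p1/holders/dm-0 and .../nvme0n1p2/holders/dm-0):

    # assumed table: two linear segments of 2097152 sectors each
    dmsetup create nvme_dm_test <<'EOF'
    0 2097152 linear /dev/nvme0n1p1 0
    2097152 2097152 linear /dev/nvme0n1p2 0
    EOF

    readlink -f /dev/mapper/nvme_dm_test           # resolves to /dev/dm-0 in this run
    mkfs.ext4 -qF /dev/mapper/nvme_dm_test
    mkdir -p "$dm_mount"                           # $dm_mount stands for .../spdk/test/setup/dm_mount
    mount /dev/mapper/nvme_dm_test "$dm_mount"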
00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:31.198 22:13:41 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:34.496 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local 
mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:34.756 22:13:44 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 
00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:38.082 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 
(ext4): 53 ef 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:38.082 00:06:38.082 real 0m10.305s 00:06:38.082 user 0m2.517s 00:06:38.082 sys 0m4.864s 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:38.082 22:13:48 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:06:38.082 ************************************ 00:06:38.082 END TEST dm_mount 00:06:38.082 ************************************ 00:06:38.082 22:13:48 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:06:38.082 22:13:48 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:06:38.082 22:13:48 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:06:38.082 22:13:48 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:38.082 22:13:48 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:38.082 22:13:48 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:38.082 22:13:48 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:38.082 22:13:48 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:38.342 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:38.342 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:06:38.342 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:38.342 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:38.342 22:13:48 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:06:38.342 22:13:48 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:38.342 22:13:48 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:38.342 22:13:48 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:38.342 22:13:48 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:38.342 22:13:48 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:38.601 22:13:48 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:38.601 00:06:38.601 real 0m28.368s 00:06:38.601 user 0m8.227s 00:06:38.601 sys 0m15.021s 00:06:38.601 22:13:48 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:38.601 22:13:48 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:38.601 ************************************ 00:06:38.601 END TEST devices 00:06:38.601 ************************************ 00:06:38.601 22:13:48 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:06:38.601 00:06:38.601 real 1m39.576s 00:06:38.601 user 0m30.128s 00:06:38.601 sys 0m55.433s 00:06:38.601 22:13:48 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:38.601 22:13:48 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:38.601 ************************************ 00:06:38.601 END TEST setup.sh 00:06:38.601 ************************************ 00:06:38.601 22:13:48 -- common/autotest_common.sh@1142 -- # return 0 00:06:38.601 22:13:48 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:06:41.902 0000:d7:05.5 (8086 201d): 
Skipping not allowed VMD controller at 0000:d7:05.5 00:06:41.902 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:41.902 Hugepages 00:06:41.902 node hugesize free / total 00:06:41.902 node0 1048576kB 0 / 0 00:06:41.902 node0 2048kB 1024 / 1024 00:06:41.902 node1 1048576kB 0 / 0 00:06:41.902 node1 2048kB 1024 / 1024 00:06:41.902 00:06:41.902 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:41.902 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:41.902 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:41.902 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:41.902 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:41.902 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:41.902 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:42.161 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:42.161 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:42.161 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:06:42.161 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:42.161 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:42.161 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:42.161 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:42.161 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:42.161 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:42.161 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:42.161 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:42.161 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:06:42.161 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:06:42.161 22:13:52 -- spdk/autotest.sh@130 -- # uname -s 00:06:42.161 22:13:52 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:06:42.161 22:13:52 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:06:42.161 22:13:52 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:46.364 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:46.364 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:46.364 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:46.364 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:48.900 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:48.900 22:13:58 -- common/autotest_common.sh@1532 -- # sleep 1 00:06:49.838 22:13:59 -- common/autotest_common.sh@1533 -- # bdfs=() 00:06:49.838 22:13:59 -- common/autotest_common.sh@1533 -- # local bdfs 00:06:49.838 22:13:59 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:06:49.838 22:13:59 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:06:49.838 22:13:59 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:49.838 22:13:59 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:49.838 22:13:59 -- 
common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:49.838 22:13:59 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:49.838 22:13:59 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:49.838 22:13:59 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:49.838 22:13:59 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:49.838 22:13:59 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:53.131 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:53.131 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:53.131 Waiting for block devices as requested 00:06:53.131 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:06:53.131 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:53.131 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:53.131 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:53.131 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:53.390 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:53.390 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:53.390 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:53.648 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:53.648 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:53.648 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:53.907 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:53.907 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:53.907 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:53.907 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:54.167 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:54.167 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:54.167 22:14:04 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:06:54.167 22:14:04 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:06:54.167 22:14:04 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:06:54.167 22:14:04 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:06:54.167 22:14:04 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:06:54.167 22:14:04 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:06:54.167 22:14:04 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:06:54.167 22:14:04 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:06:54.167 22:14:04 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:06:54.167 22:14:04 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:06:54.167 22:14:04 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:06:54.167 22:14:04 -- common/autotest_common.sh@1545 -- # grep oacs 00:06:54.167 22:14:04 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:06:54.167 22:14:04 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f' 00:06:54.167 22:14:04 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:06:54.167 22:14:04 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:06:54.167 22:14:04 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:06:54.167 22:14:04 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:06:54.167 22:14:04 -- common/autotest_common.sh@1554 -- # cut 
-d: -f2 00:06:54.427 22:14:04 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:06:54.427 22:14:04 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:06:54.427 22:14:04 -- common/autotest_common.sh@1557 -- # continue 00:06:54.427 22:14:04 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:06:54.427 22:14:04 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:54.427 22:14:04 -- common/autotest_common.sh@10 -- # set +x 00:06:54.427 22:14:04 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:06:54.427 22:14:04 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:54.427 22:14:04 -- common/autotest_common.sh@10 -- # set +x 00:06:54.427 22:14:04 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:57.714 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:57.714 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:57.714 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:57.714 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:57.714 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:57.714 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:57.714 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:57.714 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:57.714 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:57.714 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:57.714 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:57.714 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:57.974 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:57.974 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:57.974 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:57.974 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:57.974 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:57.974 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:07:00.511 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:07:00.511 22:14:10 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:07:00.511 22:14:10 -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:00.511 22:14:10 -- common/autotest_common.sh@10 -- # set +x 00:07:00.511 22:14:10 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:07:00.511 22:14:10 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:07:00.511 22:14:10 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:07:00.511 22:14:10 -- common/autotest_common.sh@1577 -- # bdfs=() 00:07:00.511 22:14:10 -- common/autotest_common.sh@1577 -- # local bdfs 00:07:00.511 22:14:10 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:07:00.511 22:14:10 -- common/autotest_common.sh@1513 -- # bdfs=() 00:07:00.511 22:14:10 -- common/autotest_common.sh@1513 -- # local bdfs 00:07:00.511 22:14:10 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:07:00.511 22:14:10 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:07:00.511 22:14:10 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:07:00.511 22:14:10 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:07:00.511 22:14:10 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:07:00.511 22:14:10 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:07:00.511 22:14:10 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:07:00.511 22:14:10 -- common/autotest_common.sh@1580 -- # device=0x0b60 
00:07:00.511 22:14:10 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:07:00.511 22:14:10 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:07:00.511 22:14:10 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:07:00.511 22:14:10 -- common/autotest_common.sh@1593 -- # return 0 00:07:00.511 22:14:10 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:07:00.511 22:14:10 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:07:00.511 22:14:10 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:07:00.511 22:14:10 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:07:00.511 22:14:10 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:07:01.449 Restarting all devices. 00:07:05.681 lstat() error: No such file or directory 00:07:05.681 QAT Error: No GENERAL section found 00:07:05.681 Failed to configure qat_dev0 00:07:05.681 lstat() error: No such file or directory 00:07:05.681 QAT Error: No GENERAL section found 00:07:05.681 Failed to configure qat_dev1 00:07:05.681 lstat() error: No such file or directory 00:07:05.681 QAT Error: No GENERAL section found 00:07:05.681 Failed to configure qat_dev2 00:07:05.681 enable sriov 00:07:05.681 Checking status of all devices. 00:07:05.681 There is 3 QAT acceleration device(s) in the system: 00:07:05.681 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:07:05.681 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:07:05.681 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:07:06.619 0000:3d:00.0 set to 16 VFs 00:07:07.999 0000:3f:00.0 set to 16 VFs 00:07:09.376 0000:da:00.0 set to 16 VFs 00:07:12.665 Properly configured the qat device with driver uio_pci_generic. 00:07:12.665 22:14:22 -- spdk/autotest.sh@162 -- # timing_enter lib 00:07:12.665 22:14:22 -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:12.665 22:14:22 -- common/autotest_common.sh@10 -- # set +x 00:07:12.665 22:14:22 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:07:12.665 22:14:22 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:07:12.665 22:14:22 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:12.665 22:14:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.665 22:14:22 -- common/autotest_common.sh@10 -- # set +x 00:07:12.665 ************************************ 00:07:12.665 START TEST env 00:07:12.665 ************************************ 00:07:12.665 22:14:22 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:07:12.665 * Looking for test storage... 
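Stepping back to the qat_setup.sh output above: the script's own commands are not echoed, only its results, so the exact mechanism behind "enable sriov" and "0000:3d:00.0 set to 16 VFs" is not visible here. The generic way to take a c6xx physical function from "state: down" to 16 virtual functions bound to a userspace driver is the standard SR-IOV sysfs knob plus a driver override, sketched below for one of the three devices as an illustration only (the VF handling is an assumption, not taken from the log):

    # enable 16 virtual functions on the QAT physical function
    echo 16 > /sys/bus/pci/devices/0000:3d:00.0/sriov_numvfs

    # bind each resulting VF to the userspace driver the log reports (uio_pci_generic)
    for vf in /sys/bus/pci/devices/0000:3d:00.0/virtfn*; do
        bdf=$(basename "$(readlink -f "$vf")")
        echo uio_pci_generic > "/sys/bus/pci/devices/$bdf/driver_override"
        echo "$bdf" > /sys/bus/pci/drivers_probe   # no-op if the VF is already bound
    done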
00:07:12.665 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:07:12.665 22:14:22 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:07:12.665 22:14:22 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:12.665 22:14:22 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.665 22:14:22 env -- common/autotest_common.sh@10 -- # set +x 00:07:12.665 ************************************ 00:07:12.665 START TEST env_memory 00:07:12.665 ************************************ 00:07:12.665 22:14:22 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:07:12.665 00:07:12.665 00:07:12.665 CUnit - A unit testing framework for C - Version 2.1-3 00:07:12.665 http://cunit.sourceforge.net/ 00:07:12.665 00:07:12.665 00:07:12.665 Suite: memory 00:07:12.665 Test: alloc and free memory map ...[2024-07-12 22:14:22.789746] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:07:12.665 passed 00:07:12.665 Test: mem map translation ...[2024-07-12 22:14:22.819096] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:07:12.665 [2024-07-12 22:14:22.819120] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:07:12.665 [2024-07-12 22:14:22.819176] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:07:12.665 [2024-07-12 22:14:22.819191] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:07:12.665 passed 00:07:12.665 Test: mem map registration ...[2024-07-12 22:14:22.876944] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:07:12.665 [2024-07-12 22:14:22.876967] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:07:12.665 passed 00:07:12.665 Test: mem map adjacent registrations ...passed 00:07:12.665 00:07:12.665 Run Summary: Type Total Ran Passed Failed Inactive 00:07:12.665 suites 1 1 n/a 0 0 00:07:12.665 tests 4 4 4 0 0 00:07:12.665 asserts 152 152 152 0 n/a 00:07:12.665 00:07:12.665 Elapsed time = 0.198 seconds 00:07:12.665 00:07:12.665 real 0m0.213s 00:07:12.665 user 0m0.201s 00:07:12.665 sys 0m0.011s 00:07:12.665 22:14:22 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:12.665 22:14:22 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:07:12.665 ************************************ 00:07:12.665 END TEST env_memory 00:07:12.665 ************************************ 00:07:12.925 22:14:22 env -- common/autotest_common.sh@1142 -- # return 0 00:07:12.925 22:14:22 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:12.925 22:14:22 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:12.925 22:14:22 env -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.925 22:14:22 env -- common/autotest_common.sh@10 -- # set +x 00:07:12.925 ************************************ 00:07:12.925 START TEST env_vtophys 00:07:12.925 ************************************ 00:07:12.925 22:14:23 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:12.925 EAL: lib.eal log level changed from notice to debug 00:07:12.925 EAL: Detected lcore 0 as core 0 on socket 0 00:07:12.925 EAL: Detected lcore 1 as core 1 on socket 0 00:07:12.925 EAL: Detected lcore 2 as core 2 on socket 0 00:07:12.925 EAL: Detected lcore 3 as core 3 on socket 0 00:07:12.925 EAL: Detected lcore 4 as core 4 on socket 0 00:07:12.925 EAL: Detected lcore 5 as core 8 on socket 0 00:07:12.925 EAL: Detected lcore 6 as core 9 on socket 0 00:07:12.925 EAL: Detected lcore 7 as core 10 on socket 0 00:07:12.925 EAL: Detected lcore 8 as core 11 on socket 0 00:07:12.925 EAL: Detected lcore 9 as core 16 on socket 0 00:07:12.925 EAL: Detected lcore 10 as core 17 on socket 0 00:07:12.925 EAL: Detected lcore 11 as core 18 on socket 0 00:07:12.925 EAL: Detected lcore 12 as core 19 on socket 0 00:07:12.925 EAL: Detected lcore 13 as core 20 on socket 0 00:07:12.925 EAL: Detected lcore 14 as core 24 on socket 0 00:07:12.925 EAL: Detected lcore 15 as core 25 on socket 0 00:07:12.925 EAL: Detected lcore 16 as core 26 on socket 0 00:07:12.925 EAL: Detected lcore 17 as core 27 on socket 0 00:07:12.925 EAL: Detected lcore 18 as core 0 on socket 1 00:07:12.925 EAL: Detected lcore 19 as core 1 on socket 1 00:07:12.925 EAL: Detected lcore 20 as core 2 on socket 1 00:07:12.925 EAL: Detected lcore 21 as core 3 on socket 1 00:07:12.925 EAL: Detected lcore 22 as core 4 on socket 1 00:07:12.925 EAL: Detected lcore 23 as core 8 on socket 1 00:07:12.925 EAL: Detected lcore 24 as core 9 on socket 1 00:07:12.925 EAL: Detected lcore 25 as core 10 on socket 1 00:07:12.925 EAL: Detected lcore 26 as core 11 on socket 1 00:07:12.925 EAL: Detected lcore 27 as core 16 on socket 1 00:07:12.925 EAL: Detected lcore 28 as core 17 on socket 1 00:07:12.925 EAL: Detected lcore 29 as core 18 on socket 1 00:07:12.925 EAL: Detected lcore 30 as core 19 on socket 1 00:07:12.925 EAL: Detected lcore 31 as core 20 on socket 1 00:07:12.925 EAL: Detected lcore 32 as core 24 on socket 1 00:07:12.926 EAL: Detected lcore 33 as core 25 on socket 1 00:07:12.926 EAL: Detected lcore 34 as core 26 on socket 1 00:07:12.926 EAL: Detected lcore 35 as core 27 on socket 1 00:07:12.926 EAL: Detected lcore 36 as core 0 on socket 0 00:07:12.926 EAL: Detected lcore 37 as core 1 on socket 0 00:07:12.926 EAL: Detected lcore 38 as core 2 on socket 0 00:07:12.926 EAL: Detected lcore 39 as core 3 on socket 0 00:07:12.926 EAL: Detected lcore 40 as core 4 on socket 0 00:07:12.926 EAL: Detected lcore 41 as core 8 on socket 0 00:07:12.926 EAL: Detected lcore 42 as core 9 on socket 0 00:07:12.926 EAL: Detected lcore 43 as core 10 on socket 0 00:07:12.926 EAL: Detected lcore 44 as core 11 on socket 0 00:07:12.926 EAL: Detected lcore 45 as core 16 on socket 0 00:07:12.926 EAL: Detected lcore 46 as core 17 on socket 0 00:07:12.926 EAL: Detected lcore 47 as core 18 on socket 0 00:07:12.926 EAL: Detected lcore 48 as core 19 on socket 0 00:07:12.926 EAL: Detected lcore 49 as core 20 on socket 0 00:07:12.926 EAL: Detected lcore 50 as core 24 on socket 0 00:07:12.926 EAL: Detected lcore 51 as core 25 on socket 0 00:07:12.926 EAL: Detected lcore 52 as core 
26 on socket 0 00:07:12.926 EAL: Detected lcore 53 as core 27 on socket 0 00:07:12.926 EAL: Detected lcore 54 as core 0 on socket 1 00:07:12.926 EAL: Detected lcore 55 as core 1 on socket 1 00:07:12.926 EAL: Detected lcore 56 as core 2 on socket 1 00:07:12.926 EAL: Detected lcore 57 as core 3 on socket 1 00:07:12.926 EAL: Detected lcore 58 as core 4 on socket 1 00:07:12.926 EAL: Detected lcore 59 as core 8 on socket 1 00:07:12.926 EAL: Detected lcore 60 as core 9 on socket 1 00:07:12.926 EAL: Detected lcore 61 as core 10 on socket 1 00:07:12.926 EAL: Detected lcore 62 as core 11 on socket 1 00:07:12.926 EAL: Detected lcore 63 as core 16 on socket 1 00:07:12.926 EAL: Detected lcore 64 as core 17 on socket 1 00:07:12.926 EAL: Detected lcore 65 as core 18 on socket 1 00:07:12.926 EAL: Detected lcore 66 as core 19 on socket 1 00:07:12.926 EAL: Detected lcore 67 as core 20 on socket 1 00:07:12.926 EAL: Detected lcore 68 as core 24 on socket 1 00:07:12.926 EAL: Detected lcore 69 as core 25 on socket 1 00:07:12.926 EAL: Detected lcore 70 as core 26 on socket 1 00:07:12.926 EAL: Detected lcore 71 as core 27 on socket 1 00:07:12.926 EAL: Maximum logical cores by configuration: 128 00:07:12.926 EAL: Detected CPU lcores: 72 00:07:12.926 EAL: Detected NUMA nodes: 2 00:07:12.926 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:07:12.926 EAL: Detected shared linkage of DPDK 00:07:12.926 EAL: No shared files mode enabled, IPC will be disabled 00:07:12.926 EAL: No shared files mode enabled, IPC is disabled 00:07:12.926 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:02.2 
wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:07:12.926 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:07:12.926 EAL: Bus pci wants IOVA as 'PA' 00:07:12.926 EAL: Bus auxiliary wants IOVA as 'DC' 00:07:12.926 EAL: Bus vdev wants IOVA as 'DC' 00:07:12.926 EAL: Selected IOVA mode 'PA' 00:07:12.926 EAL: Probing VFIO support... 00:07:12.926 EAL: IOMMU type 1 (Type 1) is supported 00:07:12.926 EAL: IOMMU type 7 (sPAPR) is not supported 00:07:12.926 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:07:12.926 EAL: VFIO support initialized 00:07:12.926 EAL: Ask a virtual area of 0x2e000 bytes 00:07:12.926 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:07:12.926 EAL: Setting up physically contiguous memory... 
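The block above shows the EAL negotiating an IOVA mode with each bus and driver (every QAT VF asks for physical addressing, so 'PA' is selected) and then probing VFIO support. For reference only, a minimal sketch of how an application could inspect that outcome after EAL start-up, using the standard DPDK calls rte_eal_iova_mode() and rte_vfio_is_enabled(); it is not part of the test binaries being run in this log.

#include <stdio.h>
#include <rte_eal.h>
#include <rte_vfio.h>

int main(int argc, char **argv)
{
    /* rte_eal_init() parses the EAL arguments and performs the bus/driver
     * IOVA negotiation and VFIO probing reported in the log above. */
    if (rte_eal_init(argc, argv) < 0)
        return 1;

    enum rte_iova_mode mode = rte_eal_iova_mode();
    printf("IOVA mode: %s\n", mode == RTE_IOVA_PA ? "PA" :
                              mode == RTE_IOVA_VA ? "VA" : "DC");

    /* Non-zero when the given VFIO kernel module is loaded and usable. */
    printf("vfio-pci enabled: %d\n", rte_vfio_is_enabled("vfio-pci"));

    rte_eal_cleanup();
    return 0;
}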
00:07:12.926 EAL: Setting maximum number of open files to 524288 00:07:12.926 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:07:12.926 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:07:12.926 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:07:12.926 EAL: Ask a virtual area of 0x61000 bytes 00:07:12.926 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:07:12.926 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:12.926 EAL: Ask a virtual area of 0x400000000 bytes 00:07:12.926 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:07:12.926 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:07:12.926 EAL: Ask a virtual area of 0x61000 bytes 00:07:12.926 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:07:12.926 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:12.926 EAL: Ask a virtual area of 0x400000000 bytes 00:07:12.926 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:07:12.926 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:07:12.926 EAL: Ask a virtual area of 0x61000 bytes 00:07:12.926 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:07:12.926 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:12.926 EAL: Ask a virtual area of 0x400000000 bytes 00:07:12.926 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:07:12.926 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:07:12.926 EAL: Ask a virtual area of 0x61000 bytes 00:07:12.926 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:07:12.926 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:12.926 EAL: Ask a virtual area of 0x400000000 bytes 00:07:12.926 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:07:12.926 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:07:12.926 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:07:12.926 EAL: Ask a virtual area of 0x61000 bytes 00:07:12.926 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:07:12.926 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:12.926 EAL: Ask a virtual area of 0x400000000 bytes 00:07:12.926 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:07:12.926 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:07:12.926 EAL: Ask a virtual area of 0x61000 bytes 00:07:12.926 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:07:12.926 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:12.926 EAL: Ask a virtual area of 0x400000000 bytes 00:07:12.926 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:07:12.926 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:07:12.926 EAL: Ask a virtual area of 0x61000 bytes 00:07:12.926 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:07:12.926 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:12.926 EAL: Ask a virtual area of 0x400000000 bytes 00:07:12.926 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:07:12.926 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:07:12.926 EAL: Ask a virtual area of 0x61000 bytes 00:07:12.926 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:07:12.926 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:12.926 EAL: Ask a virtual area of 0x400000000 bytes 00:07:12.926 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:07:12.926 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:07:12.926 EAL: Hugepages will be freed exactly as allocated. 00:07:12.926 EAL: No shared files mode enabled, IPC is disabled 00:07:12.926 EAL: No shared files mode enabled, IPC is disabled 00:07:12.926 EAL: TSC frequency is ~2300000 KHz 00:07:12.926 EAL: Main lcore 0 is ready (tid=7f45cca8ab00;cpuset=[0]) 00:07:12.926 EAL: Trying to obtain current memory policy. 00:07:12.926 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:12.926 EAL: Restoring previous memory policy: 0 00:07:12.926 EAL: request: mp_malloc_sync 00:07:12.926 EAL: No shared files mode enabled, IPC is disabled 00:07:12.926 EAL: Heap on socket 0 was expanded by 2MB 00:07:12.926 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:07:12.926 EAL: probe driver: 8086:37c9 qat 00:07:12.926 EAL: PCI memory mapped at 0x202001000000 00:07:12.926 EAL: PCI memory mapped at 0x202001001000 00:07:12.926 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:07:12.926 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:07:12.926 EAL: probe driver: 8086:37c9 qat 00:07:12.926 EAL: PCI memory mapped at 0x202001002000 00:07:12.926 EAL: PCI memory mapped at 0x202001003000 00:07:12.926 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:07:12.926 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:07:12.926 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001004000 00:07:12.927 EAL: PCI memory mapped at 0x202001005000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001006000 00:07:12.927 EAL: PCI memory mapped at 0x202001007000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001008000 00:07:12.927 EAL: PCI memory mapped at 0x202001009000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200100a000 00:07:12.927 EAL: PCI memory mapped at 0x20200100b000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200100c000 00:07:12.927 EAL: PCI memory mapped at 0x20200100d000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200100e000 00:07:12.927 EAL: PCI memory mapped at 0x20200100f000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001010000 00:07:12.927 EAL: PCI memory mapped at 0x202001011000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 
EAL: PCI memory mapped at 0x202001012000 00:07:12.927 EAL: PCI memory mapped at 0x202001013000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001014000 00:07:12.927 EAL: PCI memory mapped at 0x202001015000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001016000 00:07:12.927 EAL: PCI memory mapped at 0x202001017000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001018000 00:07:12.927 EAL: PCI memory mapped at 0x202001019000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200101a000 00:07:12.927 EAL: PCI memory mapped at 0x20200101b000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200101c000 00:07:12.927 EAL: PCI memory mapped at 0x20200101d000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:07:12.927 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200101e000 00:07:12.927 EAL: PCI memory mapped at 0x20200101f000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001020000 00:07:12.927 EAL: PCI memory mapped at 0x202001021000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001022000 00:07:12.927 EAL: PCI memory mapped at 0x202001023000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001024000 00:07:12.927 EAL: PCI memory mapped at 0x202001025000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001026000 00:07:12.927 EAL: PCI memory mapped at 0x202001027000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001028000 00:07:12.927 EAL: PCI memory mapped at 0x202001029000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 
00:07:12.927 EAL: PCI memory mapped at 0x20200102a000 00:07:12.927 EAL: PCI memory mapped at 0x20200102b000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200102c000 00:07:12.927 EAL: PCI memory mapped at 0x20200102d000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200102e000 00:07:12.927 EAL: PCI memory mapped at 0x20200102f000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001030000 00:07:12.927 EAL: PCI memory mapped at 0x202001031000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001032000 00:07:12.927 EAL: PCI memory mapped at 0x202001033000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001034000 00:07:12.927 EAL: PCI memory mapped at 0x202001035000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001036000 00:07:12.927 EAL: PCI memory mapped at 0x202001037000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001038000 00:07:12.927 EAL: PCI memory mapped at 0x202001039000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200103a000 00:07:12.927 EAL: PCI memory mapped at 0x20200103b000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200103c000 00:07:12.927 EAL: PCI memory mapped at 0x20200103d000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:07:12.927 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200103e000 00:07:12.927 EAL: PCI memory mapped at 0x20200103f000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:07:12.927 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001040000 00:07:12.927 EAL: PCI memory mapped at 0x202001041000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:07:12.927 EAL: Trying to obtain current memory policy. 
00:07:12.927 EAL: Setting policy MPOL_PREFERRED for socket 1 00:07:12.927 EAL: Restoring previous memory policy: 4 00:07:12.927 EAL: request: mp_malloc_sync 00:07:12.927 EAL: No shared files mode enabled, IPC is disabled 00:07:12.927 EAL: Heap on socket 1 was expanded by 2MB 00:07:12.927 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001042000 00:07:12.927 EAL: PCI memory mapped at 0x202001043000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:07:12.927 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001044000 00:07:12.927 EAL: PCI memory mapped at 0x202001045000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:07:12.927 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001046000 00:07:12.927 EAL: PCI memory mapped at 0x202001047000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:07:12.927 EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x202001048000 00:07:12.927 EAL: PCI memory mapped at 0x202001049000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:07:12.927 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200104a000 00:07:12.927 EAL: PCI memory mapped at 0x20200104b000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:07:12.927 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200104c000 00:07:12.927 EAL: PCI memory mapped at 0x20200104d000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:07:12.927 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:07:12.927 EAL: probe driver: 8086:37c9 qat 00:07:12.927 EAL: PCI memory mapped at 0x20200104e000 00:07:12.927 EAL: PCI memory mapped at 0x20200104f000 00:07:12.927 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:07:12.927 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:07:12.928 EAL: probe driver: 8086:37c9 qat 00:07:12.928 EAL: PCI memory mapped at 0x202001050000 00:07:12.928 EAL: PCI memory mapped at 0x202001051000 00:07:12.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:07:12.928 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:07:12.928 EAL: probe driver: 8086:37c9 qat 00:07:12.928 EAL: PCI memory mapped at 0x202001052000 00:07:12.928 EAL: PCI memory mapped at 0x202001053000 00:07:12.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:07:12.928 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:07:12.928 EAL: probe driver: 8086:37c9 qat 00:07:12.928 EAL: PCI memory mapped at 0x202001054000 00:07:12.928 EAL: PCI memory mapped at 0x202001055000 00:07:12.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:07:12.928 EAL: PCI device 0000:da:02.3 on NUMA socket 1 00:07:12.928 EAL: probe driver: 8086:37c9 qat 00:07:12.928 EAL: PCI memory mapped at 0x202001056000 00:07:12.928 EAL: PCI memory mapped at 0x202001057000 00:07:12.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 
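The "Heap on socket 1 was expanded by 2MB" message above is the malloc heap for NUMA node 1 growing on first use, triggered when the first socket-1 QAT device is probed. A hedged illustration of NUMA-aware allocation that exercises the same per-socket heaps (an illustrative sketch, not the code under test; the tag "example" and the 1 MiB size are arbitrary):

#include <stdio.h>
#include <rte_eal.h>
#include <rte_malloc.h>

int main(int argc, char **argv)
{
    if (rte_eal_init(argc, argv) < 0)
        return 1;

    /* Asking for memory on a specific socket makes the EAL grow that
     * socket's heap on demand, producing "Heap on socket N was expanded"
     * messages like the ones in this log. */
    void *buf = rte_malloc_socket("example", 1 << 20, 64, 1 /* NUMA socket 1 */);
    if (buf == NULL)
        printf("allocation on socket 1 failed\n");
    else
        printf("allocated 1 MiB on socket 1 at %p\n", buf);

    rte_free(buf);
    rte_eal_cleanup();
    return 0;
}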
00:07:12.928 EAL: PCI device 0000:da:02.4 on NUMA socket 1 00:07:12.928 EAL: probe driver: 8086:37c9 qat 00:07:12.928 EAL: PCI memory mapped at 0x202001058000 00:07:12.928 EAL: PCI memory mapped at 0x202001059000 00:07:12.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:07:12.928 EAL: PCI device 0000:da:02.5 on NUMA socket 1 00:07:12.928 EAL: probe driver: 8086:37c9 qat 00:07:12.928 EAL: PCI memory mapped at 0x20200105a000 00:07:12.928 EAL: PCI memory mapped at 0x20200105b000 00:07:12.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:07:12.928 EAL: PCI device 0000:da:02.6 on NUMA socket 1 00:07:12.928 EAL: probe driver: 8086:37c9 qat 00:07:12.928 EAL: PCI memory mapped at 0x20200105c000 00:07:12.928 EAL: PCI memory mapped at 0x20200105d000 00:07:12.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:07:12.928 EAL: PCI device 0000:da:02.7 on NUMA socket 1 00:07:12.928 EAL: probe driver: 8086:37c9 qat 00:07:12.928 EAL: PCI memory mapped at 0x20200105e000 00:07:12.928 EAL: PCI memory mapped at 0x20200105f000 00:07:12.928 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: No PCI address specified using 'addr=' in: bus=pci 00:07:12.928 EAL: Mem event callback 'spdk:(nil)' registered 00:07:12.928 00:07:12.928 00:07:12.928 CUnit - A unit testing framework for C - Version 2.1-3 00:07:12.928 http://cunit.sourceforge.net/ 00:07:12.928 00:07:12.928 00:07:12.928 Suite: components_suite 00:07:12.928 Test: vtophys_malloc_test ...passed 00:07:12.928 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:07:12.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:12.928 EAL: Restoring previous memory policy: 4 00:07:12.928 EAL: Calling mem event callback 'spdk:(nil)' 00:07:12.928 EAL: request: mp_malloc_sync 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: Heap on socket 0 was expanded by 4MB 00:07:12.928 EAL: Calling mem event callback 'spdk:(nil)' 00:07:12.928 EAL: request: mp_malloc_sync 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: Heap on socket 0 was shrunk by 4MB 00:07:12.928 EAL: Trying to obtain current memory policy. 00:07:12.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:12.928 EAL: Restoring previous memory policy: 4 00:07:12.928 EAL: Calling mem event callback 'spdk:(nil)' 00:07:12.928 EAL: request: mp_malloc_sync 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: Heap on socket 0 was expanded by 6MB 00:07:12.928 EAL: Calling mem event callback 'spdk:(nil)' 00:07:12.928 EAL: request: mp_malloc_sync 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: Heap on socket 0 was shrunk by 6MB 00:07:12.928 EAL: Trying to obtain current memory policy. 
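The "Mem event callback 'spdk:(nil)' registered" entry and the repeated "Calling mem event callback 'spdk:(nil)'" lines above are the EAL memory hotplug notifications: whenever the vtophys test grows or shrinks a heap, every registered listener is invoked. A minimal sketch of that mechanism through the public DPDK API (the callback name "example-cb" and the 64 MiB allocation are made up for illustration):

#include <stdio.h>
#include <rte_eal.h>
#include <rte_malloc.h>
#include <rte_memory.h>

/* Invoked by the EAL whenever hugepage memory is added to or removed from
 * a heap; this is what produces the "Calling mem event callback" lines. */
static void
mem_event_cb(enum rte_mem_event event, const void *addr, size_t len, void *arg)
{
    (void)arg;
    printf("%s: addr=%p len=%zu\n",
           event == RTE_MEM_EVENT_ALLOC ? "heap grew" : "heap shrank", addr, len);
}

int main(int argc, char **argv)
{
    if (rte_eal_init(argc, argv) < 0)
        return 1;

    rte_mem_event_callback_register("example-cb", mem_event_cb, NULL);

    /* A large allocation typically forces the heap to grow (ALLOC event);
     * freeing it lets the EAL release the pages again (FREE event). */
    void *buf = rte_malloc(NULL, 64 * 1024 * 1024, 0);
    rte_free(buf);

    rte_mem_event_callback_unregister("example-cb", NULL);
    rte_eal_cleanup();
    return 0;
}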
00:07:12.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:12.928 EAL: Restoring previous memory policy: 4 00:07:12.928 EAL: Calling mem event callback 'spdk:(nil)' 00:07:12.928 EAL: request: mp_malloc_sync 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: Heap on socket 0 was expanded by 10MB 00:07:12.928 EAL: Calling mem event callback 'spdk:(nil)' 00:07:12.928 EAL: request: mp_malloc_sync 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: Heap on socket 0 was shrunk by 10MB 00:07:12.928 EAL: Trying to obtain current memory policy. 00:07:12.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:12.928 EAL: Restoring previous memory policy: 4 00:07:12.928 EAL: Calling mem event callback 'spdk:(nil)' 00:07:12.928 EAL: request: mp_malloc_sync 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: Heap on socket 0 was expanded by 18MB 00:07:12.928 EAL: Calling mem event callback 'spdk:(nil)' 00:07:12.928 EAL: request: mp_malloc_sync 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: Heap on socket 0 was shrunk by 18MB 00:07:12.928 EAL: Trying to obtain current memory policy. 00:07:12.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:12.928 EAL: Restoring previous memory policy: 4 00:07:12.928 EAL: Calling mem event callback 'spdk:(nil)' 00:07:12.928 EAL: request: mp_malloc_sync 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: Heap on socket 0 was expanded by 34MB 00:07:12.928 EAL: Calling mem event callback 'spdk:(nil)' 00:07:12.928 EAL: request: mp_malloc_sync 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: Heap on socket 0 was shrunk by 34MB 00:07:12.928 EAL: Trying to obtain current memory policy. 00:07:12.928 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:12.928 EAL: Restoring previous memory policy: 4 00:07:12.928 EAL: Calling mem event callback 'spdk:(nil)' 00:07:12.928 EAL: request: mp_malloc_sync 00:07:12.928 EAL: No shared files mode enabled, IPC is disabled 00:07:12.928 EAL: Heap on socket 0 was expanded by 66MB 00:07:12.928 EAL: Calling mem event callback 'spdk:(nil)' 00:07:13.187 EAL: request: mp_malloc_sync 00:07:13.187 EAL: No shared files mode enabled, IPC is disabled 00:07:13.187 EAL: Heap on socket 0 was shrunk by 66MB 00:07:13.187 EAL: Trying to obtain current memory policy. 00:07:13.187 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:13.187 EAL: Restoring previous memory policy: 4 00:07:13.187 EAL: Calling mem event callback 'spdk:(nil)' 00:07:13.187 EAL: request: mp_malloc_sync 00:07:13.187 EAL: No shared files mode enabled, IPC is disabled 00:07:13.187 EAL: Heap on socket 0 was expanded by 130MB 00:07:13.187 EAL: Calling mem event callback 'spdk:(nil)' 00:07:13.187 EAL: request: mp_malloc_sync 00:07:13.187 EAL: No shared files mode enabled, IPC is disabled 00:07:13.187 EAL: Heap on socket 0 was shrunk by 130MB 00:07:13.187 EAL: Trying to obtain current memory policy. 
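Each expand/shrink round above corresponds to the vtophys test allocating and then freeing a progressively larger DMA-safe buffer and translating it to a physical address. A rough sketch of that kind of call sequence, assuming the public spdk/env.h API; the application name, buffer size, and alignment below are invented for illustration:

#include <stdio.h>
#include <inttypes.h>
#include "spdk/env.h"

int main(void)
{
    struct spdk_env_opts opts;

    spdk_env_opts_init(&opts);
    opts.name = "vtophys_example";          /* hypothetical app name */
    if (spdk_env_init(&opts) < 0)
        return 1;

    /* DMA-safe allocation from the pinned hugepage heap; the test uses
     * progressively larger sizes, which is what expands the heap above. */
    void *buf = spdk_malloc(0x1000, 0x1000, NULL,
                            SPDK_ENV_SOCKET_ID_ANY, SPDK_MALLOC_DMA);
    if (buf == NULL)
        return 1;

    /* Translate the virtual address to a physical (IOVA) address. */
    uint64_t paddr = spdk_vtophys(buf, NULL);
    if (paddr == SPDK_VTOPHYS_ERROR)
        printf("no translation for %p\n", buf);
    else
        printf("vaddr %p -> paddr 0x%" PRIx64 "\n", buf, paddr);

    spdk_free(buf);
    spdk_env_fini();
    return 0;
}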
00:07:13.187 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:13.187 EAL: Restoring previous memory policy: 4 00:07:13.187 EAL: Calling mem event callback 'spdk:(nil)' 00:07:13.187 EAL: request: mp_malloc_sync 00:07:13.187 EAL: No shared files mode enabled, IPC is disabled 00:07:13.187 EAL: Heap on socket 0 was expanded by 258MB 00:07:13.187 EAL: Calling mem event callback 'spdk:(nil)' 00:07:13.187 EAL: request: mp_malloc_sync 00:07:13.187 EAL: No shared files mode enabled, IPC is disabled 00:07:13.187 EAL: Heap on socket 0 was shrunk by 258MB 00:07:13.187 EAL: Trying to obtain current memory policy. 00:07:13.187 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:13.446 EAL: Restoring previous memory policy: 4 00:07:13.446 EAL: Calling mem event callback 'spdk:(nil)' 00:07:13.446 EAL: request: mp_malloc_sync 00:07:13.446 EAL: No shared files mode enabled, IPC is disabled 00:07:13.446 EAL: Heap on socket 0 was expanded by 514MB 00:07:13.446 EAL: Calling mem event callback 'spdk:(nil)' 00:07:13.706 EAL: request: mp_malloc_sync 00:07:13.706 EAL: No shared files mode enabled, IPC is disabled 00:07:13.706 EAL: Heap on socket 0 was shrunk by 514MB 00:07:13.706 EAL: Trying to obtain current memory policy. 00:07:13.706 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:13.965 EAL: Restoring previous memory policy: 4 00:07:13.965 EAL: Calling mem event callback 'spdk:(nil)' 00:07:13.965 EAL: request: mp_malloc_sync 00:07:13.965 EAL: No shared files mode enabled, IPC is disabled 00:07:13.965 EAL: Heap on socket 0 was expanded by 1026MB 00:07:13.965 EAL: Calling mem event callback 'spdk:(nil)' 00:07:14.225 EAL: request: mp_malloc_sync 00:07:14.225 EAL: No shared files mode enabled, IPC is disabled 00:07:14.225 EAL: Heap on socket 0 was shrunk by 1026MB 00:07:14.225 passed 00:07:14.225 00:07:14.225 Run Summary: Type Total Ran Passed Failed Inactive 00:07:14.225 suites 1 1 n/a 0 0 00:07:14.225 tests 2 2 2 0 0 00:07:14.225 asserts 5918 5918 5918 0 n/a 00:07:14.225 00:07:14.225 Elapsed time = 1.184 seconds 00:07:14.225 EAL: No shared files mode enabled, IPC is disabled 00:07:14.225 EAL: No shared files mode enabled, IPC is disabled 00:07:14.225 EAL: No shared files mode enabled, IPC is disabled 00:07:14.225 00:07:14.225 real 0m1.381s 00:07:14.225 user 0m0.775s 00:07:14.225 sys 0m0.575s 00:07:14.225 22:14:24 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:14.225 22:14:24 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:07:14.225 ************************************ 00:07:14.225 END TEST env_vtophys 00:07:14.225 ************************************ 00:07:14.225 22:14:24 env -- common/autotest_common.sh@1142 -- # return 0 00:07:14.225 22:14:24 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:07:14.225 22:14:24 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:14.225 22:14:24 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.225 22:14:24 env -- common/autotest_common.sh@10 -- # set +x 00:07:14.225 ************************************ 00:07:14.225 START TEST env_pci 00:07:14.225 ************************************ 00:07:14.225 22:14:24 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:07:14.225 00:07:14.225 00:07:14.225 CUnit - A unit testing framework for C - Version 2.1-3 00:07:14.225 http://cunit.sourceforge.net/ 00:07:14.225 00:07:14.225 00:07:14.225 Suite: pci 00:07:14.225 Test: 
pci_hook ...[2024-07-12 22:14:24.512639] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 3380362 has claimed it 00:07:14.485 EAL: Cannot find device (10000:00:01.0) 00:07:14.485 EAL: Failed to attach device on primary process 00:07:14.485 passed 00:07:14.485 00:07:14.485 Run Summary: Type Total Ran Passed Failed Inactive 00:07:14.485 suites 1 1 n/a 0 0 00:07:14.485 tests 1 1 1 0 0 00:07:14.485 asserts 25 25 25 0 n/a 00:07:14.485 00:07:14.485 Elapsed time = 0.042 seconds 00:07:14.485 00:07:14.485 real 0m0.070s 00:07:14.485 user 0m0.015s 00:07:14.485 sys 0m0.054s 00:07:14.485 22:14:24 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:14.485 22:14:24 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:07:14.485 ************************************ 00:07:14.485 END TEST env_pci 00:07:14.485 ************************************ 00:07:14.485 22:14:24 env -- common/autotest_common.sh@1142 -- # return 0 00:07:14.485 22:14:24 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:07:14.485 22:14:24 env -- env/env.sh@15 -- # uname 00:07:14.485 22:14:24 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:07:14.485 22:14:24 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:07:14.485 22:14:24 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:07:14.485 22:14:24 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:07:14.485 22:14:24 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.485 22:14:24 env -- common/autotest_common.sh@10 -- # set +x 00:07:14.485 ************************************ 00:07:14.485 START TEST env_dpdk_post_init 00:07:14.485 ************************************ 00:07:14.485 22:14:24 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:07:14.485 EAL: Detected CPU lcores: 72 00:07:14.485 EAL: Detected NUMA nodes: 2 00:07:14.485 EAL: Detected shared linkage of DPDK 00:07:14.485 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:07:14.485 EAL: Selected IOVA mode 'PA' 00:07:14.485 EAL: VFIO support initialized 00:07:14.485 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:07:14.485 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:07:14.485 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.485 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:07:14.485 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.485 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:07:14.485 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:07:14.485 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.485 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:07:14.485 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.485 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:07:14.485 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:07:14.485 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, 
max queue pairs: 0 00:07:14.485 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:07:14.485 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.485 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:07:14.485 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:07:14.485 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.485 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:07:14.485 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: 
Creating cryptodev 0000:3d:02.2_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 
00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters 
- name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.486 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:07:14.486 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.486 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max 
queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: 
qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:14.487 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:07:14.487 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:14.487 TELEMETRY: No legacy callbacks, legacy socket not created 00:07:14.747 EAL: Using IOMMU type 1 (Type 1) 00:07:14.747 EAL: Ignore mapping IO port bar(1) 00:07:14.747 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:07:14.747 EAL: Ignore mapping IO port bar(1) 00:07:14.747 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:07:14.747 EAL: Ignore mapping IO port bar(1) 00:07:14.747 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:07:14.747 EAL: Ignore mapping IO port bar(1) 00:07:14.747 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:07:14.747 EAL: Ignore mapping IO port bar(1) 00:07:14.747 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:07:14.747 EAL: Ignore mapping IO port bar(1) 00:07:14.747 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:07:14.747 EAL: Ignore mapping IO port bar(1) 00:07:14.747 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:07:14.747 EAL: Ignore mapping IO port bar(1) 00:07:14.747 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:07:15.007 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:07:15.007 EAL: Ignore mapping IO port bar(1) 00:07:15.007 EAL: Probe PCI driver: 
spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:07:15.007 EAL: Ignore mapping IO port bar(1) 00:07:15.007 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:07:15.007 EAL: Ignore mapping IO port bar(1) 00:07:15.007 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:07:15.007 EAL: Ignore mapping IO port bar(1) 00:07:15.007 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:07:15.007 EAL: Ignore mapping IO port bar(1) 00:07:15.007 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:07:15.007 EAL: Ignore mapping IO port bar(1) 00:07:15.007 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 00:07:15.007 EAL: Ignore mapping IO port bar(1) 00:07:15.007 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:07:15.007 EAL: Ignore mapping IO port bar(1) 00:07:15.007 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:07:15.007 EAL: Ignore mapping IO port bar(1) 00:07:15.007 EAL: Ignore mapping IO port bar(5) 00:07:15.007 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:07:15.007 EAL: Ignore mapping IO port bar(1) 00:07:15.007 EAL: Ignore mapping IO port bar(5) 00:07:15.007 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:07:18.303 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:07:18.303 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:07:18.303 Starting DPDK initialization... 00:07:18.303 Starting SPDK post initialization... 00:07:18.303 SPDK NVMe probe 00:07:18.303 Attaching to 0000:5e:00.0 00:07:18.303 Attached to 0000:5e:00.0 00:07:18.303 Cleaning up... 00:07:18.303 00:07:18.303 real 0m3.505s 00:07:18.303 user 0m2.393s 00:07:18.303 sys 0m0.668s 00:07:18.303 22:14:28 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:18.303 22:14:28 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:07:18.303 ************************************ 00:07:18.303 END TEST env_dpdk_post_init 00:07:18.303 ************************************ 00:07:18.303 22:14:28 env -- common/autotest_common.sh@1142 -- # return 0 00:07:18.303 22:14:28 env -- env/env.sh@26 -- # uname 00:07:18.303 22:14:28 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:07:18.303 22:14:28 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:07:18.303 22:14:28 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:18.303 22:14:28 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:18.303 22:14:28 env -- common/autotest_common.sh@10 -- # set +x 00:07:18.303 ************************************ 00:07:18.303 START TEST env_mem_callbacks 00:07:18.303 ************************************ 00:07:18.303 22:14:28 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:07:18.303 EAL: Detected CPU lcores: 72 00:07:18.303 EAL: Detected NUMA nodes: 2 00:07:18.303 EAL: Detected shared linkage of DPDK 00:07:18.303 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:07:18.303 EAL: Selected IOVA mode 'PA' 00:07:18.303 EAL: VFIO support initialized 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:07:18.303 CRYPTODEV: 
Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 
0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.303 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:07:18.303 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue 
pairs: 0 00:07:18.303 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating 
cryptodev 0000:3f:02.0_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:07:18.304 
CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.304 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:07:18.304 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.304 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 
0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.305 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.305 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.305 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.305 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.305 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.305 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.305 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:18.305 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:07:18.305 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:18.305 TELEMETRY: No legacy callbacks, legacy socket not created 00:07:18.305 00:07:18.305 00:07:18.305 CUnit - A unit testing framework for C - Version 2.1-3 00:07:18.305 http://cunit.sourceforge.net/ 00:07:18.305 00:07:18.305 00:07:18.305 Suite: memory 00:07:18.305 Test: test ... 
00:07:18.305 register 0x200000200000 2097152 00:07:18.305 register 0x201000a00000 2097152 00:07:18.305 malloc 3145728 00:07:18.305 register 0x200000400000 4194304 00:07:18.305 buf 0x200000500000 len 3145728 PASSED 00:07:18.305 malloc 64 00:07:18.305 buf 0x2000004fff40 len 64 PASSED 00:07:18.305 malloc 4194304 00:07:18.305 register 0x200000800000 6291456 00:07:18.305 buf 0x200000a00000 len 4194304 PASSED 00:07:18.305 free 0x200000500000 3145728 00:07:18.305 free 0x2000004fff40 64 00:07:18.305 unregister 0x200000400000 4194304 PASSED 00:07:18.305 free 0x200000a00000 4194304 00:07:18.305 unregister 0x200000800000 6291456 PASSED 00:07:18.305 malloc 8388608 00:07:18.305 register 0x200000400000 10485760 00:07:18.305 buf 0x200000600000 len 8388608 PASSED 00:07:18.305 free 0x200000600000 8388608 00:07:18.305 unregister 0x200000400000 10485760 PASSED 00:07:18.305 passed 00:07:18.305 00:07:18.305 Run Summary: Type Total Ran Passed Failed Inactive 00:07:18.305 suites 1 1 n/a 0 0 00:07:18.305 tests 1 1 1 0 0 00:07:18.305 asserts 16 16 16 0 n/a 00:07:18.305 00:07:18.305 Elapsed time = 0.008 seconds 00:07:18.305 00:07:18.305 real 0m0.111s 00:07:18.305 user 0m0.033s 00:07:18.305 sys 0m0.078s 00:07:18.305 22:14:28 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:18.305 22:14:28 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:07:18.305 ************************************ 00:07:18.305 END TEST env_mem_callbacks 00:07:18.305 ************************************ 00:07:18.305 22:14:28 env -- common/autotest_common.sh@1142 -- # return 0 00:07:18.305 00:07:18.305 real 0m5.762s 00:07:18.305 user 0m3.584s 00:07:18.305 sys 0m1.739s 00:07:18.305 22:14:28 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:18.305 22:14:28 env -- common/autotest_common.sh@10 -- # set +x 00:07:18.305 ************************************ 00:07:18.305 END TEST env 00:07:18.305 ************************************ 00:07:18.305 22:14:28 -- common/autotest_common.sh@1142 -- # return 0 00:07:18.305 22:14:28 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:07:18.305 22:14:28 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:18.305 22:14:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:18.305 22:14:28 -- common/autotest_common.sh@10 -- # set +x 00:07:18.305 ************************************ 00:07:18.305 START TEST rpc 00:07:18.305 ************************************ 00:07:18.305 22:14:28 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:07:18.305 * Looking for test storage... 
00:07:18.305 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:18.305 22:14:28 rpc -- rpc/rpc.sh@65 -- # spdk_pid=3381010 00:07:18.305 22:14:28 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:18.305 22:14:28 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:07:18.305 22:14:28 rpc -- rpc/rpc.sh@67 -- # waitforlisten 3381010 00:07:18.305 22:14:28 rpc -- common/autotest_common.sh@829 -- # '[' -z 3381010 ']' 00:07:18.305 22:14:28 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:18.305 22:14:28 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:18.305 22:14:28 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:18.305 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:18.305 22:14:28 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:18.305 22:14:28 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:18.305 [2024-07-12 22:14:28.602706] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:07:18.305 [2024-07-12 22:14:28.602780] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3381010 ] 00:07:18.565 [2024-07-12 22:14:28.734301] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.565 [2024-07-12 22:14:28.835142] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:07:18.565 [2024-07-12 22:14:28.835196] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 3381010' to capture a snapshot of events at runtime. 00:07:18.565 [2024-07-12 22:14:28.835211] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:07:18.565 [2024-07-12 22:14:28.835225] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:07:18.565 [2024-07-12 22:14:28.835236] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid3381010 for offline analysis/debug. 
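For anyone replaying this rpc suite by hand outside Jenkins, the equivalent startup is roughly the sketch below. It is not part of the captured output; it assumes a built local SPDK checkout and the default RPC socket /var/tmp/spdk.sock, and the 8 MiB / 512-byte malloc bdev simply mirrors the Malloc0 that rpc_integrity creates further down.

    # Sketch only: launch the target with the same bdev tracepoint group used here,
    # then confirm it answers RPCs on the default UNIX socket.
    ./build/bin/spdk_tgt -e bdev &
    tgt_pid=$!
    sleep 2                                    # the harness uses waitforlisten instead of a fixed sleep
    ./scripts/rpc.py spdk_get_version          # simplest liveness check
    ./scripts/rpc.py bdev_malloc_create 8 512  # 16384 blocks x 512 B, i.e. the Malloc0 seen below
    ./scripts/rpc.py bdev_get_bdevs            # returns the JSON bdev list dumped by rpc_integrity
    kill "$tgt_pid"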
00:07:18.565 [2024-07-12 22:14:28.835269] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.503 22:14:29 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:19.503 22:14:29 rpc -- common/autotest_common.sh@862 -- # return 0 00:07:19.503 22:14:29 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:19.503 22:14:29 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:19.503 22:14:29 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:07:19.503 22:14:29 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:07:19.503 22:14:29 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:19.503 22:14:29 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.503 22:14:29 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:19.503 ************************************ 00:07:19.503 START TEST rpc_integrity 00:07:19.503 ************************************ 00:07:19.503 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:07:19.503 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:19.503 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.503 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:19.503 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.503 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:19.503 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:07:19.503 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:19.503 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:19.503 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.503 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:19.503 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.503 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:07:19.503 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:19.503 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.503 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:19.503 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.503 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:19.503 { 00:07:19.503 "name": "Malloc0", 00:07:19.503 "aliases": [ 00:07:19.503 "4fc7b60f-eb1d-4fa5-b35f-0ccf32f08ac1" 00:07:19.503 ], 00:07:19.503 "product_name": "Malloc disk", 00:07:19.503 "block_size": 512, 00:07:19.503 "num_blocks": 16384, 00:07:19.503 "uuid": "4fc7b60f-eb1d-4fa5-b35f-0ccf32f08ac1", 00:07:19.503 "assigned_rate_limits": { 00:07:19.503 "rw_ios_per_sec": 0, 00:07:19.503 "rw_mbytes_per_sec": 0, 00:07:19.503 "r_mbytes_per_sec": 0, 00:07:19.503 "w_mbytes_per_sec": 0 00:07:19.503 }, 00:07:19.503 
"claimed": false, 00:07:19.504 "zoned": false, 00:07:19.504 "supported_io_types": { 00:07:19.504 "read": true, 00:07:19.504 "write": true, 00:07:19.504 "unmap": true, 00:07:19.504 "flush": true, 00:07:19.504 "reset": true, 00:07:19.504 "nvme_admin": false, 00:07:19.504 "nvme_io": false, 00:07:19.504 "nvme_io_md": false, 00:07:19.504 "write_zeroes": true, 00:07:19.504 "zcopy": true, 00:07:19.504 "get_zone_info": false, 00:07:19.504 "zone_management": false, 00:07:19.504 "zone_append": false, 00:07:19.504 "compare": false, 00:07:19.504 "compare_and_write": false, 00:07:19.504 "abort": true, 00:07:19.504 "seek_hole": false, 00:07:19.504 "seek_data": false, 00:07:19.504 "copy": true, 00:07:19.504 "nvme_iov_md": false 00:07:19.504 }, 00:07:19.504 "memory_domains": [ 00:07:19.504 { 00:07:19.504 "dma_device_id": "system", 00:07:19.504 "dma_device_type": 1 00:07:19.504 }, 00:07:19.504 { 00:07:19.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.504 "dma_device_type": 2 00:07:19.504 } 00:07:19.504 ], 00:07:19.504 "driver_specific": {} 00:07:19.504 } 00:07:19.504 ]' 00:07:19.504 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:07:19.504 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:19.504 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:19.504 [2024-07-12 22:14:29.710909] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:07:19.504 [2024-07-12 22:14:29.710962] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:19.504 [2024-07-12 22:14:29.710983] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b0feb0 00:07:19.504 [2024-07-12 22:14:29.710996] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:19.504 [2024-07-12 22:14:29.712485] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:19.504 [2024-07-12 22:14:29.712514] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:19.504 Passthru0 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.504 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.504 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:19.504 { 00:07:19.504 "name": "Malloc0", 00:07:19.504 "aliases": [ 00:07:19.504 "4fc7b60f-eb1d-4fa5-b35f-0ccf32f08ac1" 00:07:19.504 ], 00:07:19.504 "product_name": "Malloc disk", 00:07:19.504 "block_size": 512, 00:07:19.504 "num_blocks": 16384, 00:07:19.504 "uuid": "4fc7b60f-eb1d-4fa5-b35f-0ccf32f08ac1", 00:07:19.504 "assigned_rate_limits": { 00:07:19.504 "rw_ios_per_sec": 0, 00:07:19.504 "rw_mbytes_per_sec": 0, 00:07:19.504 "r_mbytes_per_sec": 0, 00:07:19.504 "w_mbytes_per_sec": 0 00:07:19.504 }, 00:07:19.504 "claimed": true, 00:07:19.504 "claim_type": "exclusive_write", 00:07:19.504 "zoned": false, 00:07:19.504 "supported_io_types": { 00:07:19.504 "read": true, 00:07:19.504 "write": true, 00:07:19.504 "unmap": true, 00:07:19.504 "flush": true, 
00:07:19.504 "reset": true, 00:07:19.504 "nvme_admin": false, 00:07:19.504 "nvme_io": false, 00:07:19.504 "nvme_io_md": false, 00:07:19.504 "write_zeroes": true, 00:07:19.504 "zcopy": true, 00:07:19.504 "get_zone_info": false, 00:07:19.504 "zone_management": false, 00:07:19.504 "zone_append": false, 00:07:19.504 "compare": false, 00:07:19.504 "compare_and_write": false, 00:07:19.504 "abort": true, 00:07:19.504 "seek_hole": false, 00:07:19.504 "seek_data": false, 00:07:19.504 "copy": true, 00:07:19.504 "nvme_iov_md": false 00:07:19.504 }, 00:07:19.504 "memory_domains": [ 00:07:19.504 { 00:07:19.504 "dma_device_id": "system", 00:07:19.504 "dma_device_type": 1 00:07:19.504 }, 00:07:19.504 { 00:07:19.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.504 "dma_device_type": 2 00:07:19.504 } 00:07:19.504 ], 00:07:19.504 "driver_specific": {} 00:07:19.504 }, 00:07:19.504 { 00:07:19.504 "name": "Passthru0", 00:07:19.504 "aliases": [ 00:07:19.504 "1aa25fae-29c8-5828-8b35-b5a7b927a1f7" 00:07:19.504 ], 00:07:19.504 "product_name": "passthru", 00:07:19.504 "block_size": 512, 00:07:19.504 "num_blocks": 16384, 00:07:19.504 "uuid": "1aa25fae-29c8-5828-8b35-b5a7b927a1f7", 00:07:19.504 "assigned_rate_limits": { 00:07:19.504 "rw_ios_per_sec": 0, 00:07:19.504 "rw_mbytes_per_sec": 0, 00:07:19.504 "r_mbytes_per_sec": 0, 00:07:19.504 "w_mbytes_per_sec": 0 00:07:19.504 }, 00:07:19.504 "claimed": false, 00:07:19.504 "zoned": false, 00:07:19.504 "supported_io_types": { 00:07:19.504 "read": true, 00:07:19.504 "write": true, 00:07:19.504 "unmap": true, 00:07:19.504 "flush": true, 00:07:19.504 "reset": true, 00:07:19.504 "nvme_admin": false, 00:07:19.504 "nvme_io": false, 00:07:19.504 "nvme_io_md": false, 00:07:19.504 "write_zeroes": true, 00:07:19.504 "zcopy": true, 00:07:19.504 "get_zone_info": false, 00:07:19.504 "zone_management": false, 00:07:19.504 "zone_append": false, 00:07:19.504 "compare": false, 00:07:19.504 "compare_and_write": false, 00:07:19.504 "abort": true, 00:07:19.504 "seek_hole": false, 00:07:19.504 "seek_data": false, 00:07:19.504 "copy": true, 00:07:19.504 "nvme_iov_md": false 00:07:19.504 }, 00:07:19.504 "memory_domains": [ 00:07:19.504 { 00:07:19.504 "dma_device_id": "system", 00:07:19.504 "dma_device_type": 1 00:07:19.504 }, 00:07:19.504 { 00:07:19.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.504 "dma_device_type": 2 00:07:19.504 } 00:07:19.504 ], 00:07:19.504 "driver_specific": { 00:07:19.504 "passthru": { 00:07:19.504 "name": "Passthru0", 00:07:19.504 "base_bdev_name": "Malloc0" 00:07:19.504 } 00:07:19.504 } 00:07:19.504 } 00:07:19.504 ]' 00:07:19.504 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:07:19.504 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:19.504 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.504 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.504 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd 
bdev_get_bdevs 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:19.504 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.504 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:19.504 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:07:19.765 22:14:29 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:19.765 00:07:19.765 real 0m0.304s 00:07:19.765 user 0m0.188s 00:07:19.765 sys 0m0.047s 00:07:19.765 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:19.765 22:14:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:19.765 ************************************ 00:07:19.765 END TEST rpc_integrity 00:07:19.765 ************************************ 00:07:19.765 22:14:29 rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:19.765 22:14:29 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:07:19.765 22:14:29 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:19.765 22:14:29 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.765 22:14:29 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:19.765 ************************************ 00:07:19.765 START TEST rpc_plugins 00:07:19.765 ************************************ 00:07:19.765 22:14:29 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:07:19.765 22:14:29 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:07:19.765 22:14:29 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.765 22:14:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:19.765 22:14:29 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.765 22:14:29 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:07:19.765 22:14:29 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:07:19.765 22:14:29 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.765 22:14:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:19.765 22:14:29 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.765 22:14:29 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:07:19.765 { 00:07:19.765 "name": "Malloc1", 00:07:19.765 "aliases": [ 00:07:19.765 "b27fec43-65c5-4e5f-a1a1-c8f1b6756e7f" 00:07:19.765 ], 00:07:19.765 "product_name": "Malloc disk", 00:07:19.765 "block_size": 4096, 00:07:19.765 "num_blocks": 256, 00:07:19.765 "uuid": "b27fec43-65c5-4e5f-a1a1-c8f1b6756e7f", 00:07:19.765 "assigned_rate_limits": { 00:07:19.765 "rw_ios_per_sec": 0, 00:07:19.765 "rw_mbytes_per_sec": 0, 00:07:19.765 "r_mbytes_per_sec": 0, 00:07:19.765 "w_mbytes_per_sec": 0 00:07:19.765 }, 00:07:19.765 "claimed": false, 00:07:19.765 "zoned": false, 00:07:19.765 "supported_io_types": { 00:07:19.765 "read": true, 00:07:19.765 "write": true, 00:07:19.765 "unmap": true, 00:07:19.765 "flush": true, 00:07:19.765 "reset": true, 00:07:19.765 "nvme_admin": false, 00:07:19.765 "nvme_io": false, 00:07:19.765 "nvme_io_md": false, 00:07:19.765 "write_zeroes": true, 00:07:19.765 "zcopy": true, 00:07:19.765 "get_zone_info": false, 00:07:19.765 "zone_management": false, 00:07:19.765 "zone_append": false, 00:07:19.765 "compare": false, 00:07:19.765 "compare_and_write": false, 00:07:19.765 "abort": true, 00:07:19.765 "seek_hole": false, 00:07:19.765 "seek_data": 
false, 00:07:19.765 "copy": true, 00:07:19.765 "nvme_iov_md": false 00:07:19.765 }, 00:07:19.765 "memory_domains": [ 00:07:19.765 { 00:07:19.765 "dma_device_id": "system", 00:07:19.765 "dma_device_type": 1 00:07:19.765 }, 00:07:19.765 { 00:07:19.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.765 "dma_device_type": 2 00:07:19.765 } 00:07:19.765 ], 00:07:19.765 "driver_specific": {} 00:07:19.765 } 00:07:19.765 ]' 00:07:19.765 22:14:29 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:07:19.765 22:14:30 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:07:19.765 22:14:30 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:07:19.765 22:14:30 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.765 22:14:30 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:19.765 22:14:30 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.765 22:14:30 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:07:19.765 22:14:30 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.765 22:14:30 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:19.765 22:14:30 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.765 22:14:30 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:07:19.765 22:14:30 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:07:20.026 22:14:30 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:07:20.026 00:07:20.026 real 0m0.157s 00:07:20.026 user 0m0.099s 00:07:20.026 sys 0m0.019s 00:07:20.026 22:14:30 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:20.026 22:14:30 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:20.026 ************************************ 00:07:20.026 END TEST rpc_plugins 00:07:20.026 ************************************ 00:07:20.026 22:14:30 rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:20.026 22:14:30 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:07:20.026 22:14:30 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:20.026 22:14:30 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.026 22:14:30 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:20.026 ************************************ 00:07:20.026 START TEST rpc_trace_cmd_test 00:07:20.026 ************************************ 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:07:20.026 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid3381010", 00:07:20.026 "tpoint_group_mask": "0x8", 00:07:20.026 "iscsi_conn": { 00:07:20.026 "mask": "0x2", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 }, 00:07:20.026 "scsi": { 00:07:20.026 "mask": "0x4", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 }, 00:07:20.026 "bdev": { 00:07:20.026 "mask": "0x8", 00:07:20.026 "tpoint_mask": "0xffffffffffffffff" 00:07:20.026 }, 00:07:20.026 "nvmf_rdma": { 00:07:20.026 
"mask": "0x10", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 }, 00:07:20.026 "nvmf_tcp": { 00:07:20.026 "mask": "0x20", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 }, 00:07:20.026 "ftl": { 00:07:20.026 "mask": "0x40", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 }, 00:07:20.026 "blobfs": { 00:07:20.026 "mask": "0x80", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 }, 00:07:20.026 "dsa": { 00:07:20.026 "mask": "0x200", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 }, 00:07:20.026 "thread": { 00:07:20.026 "mask": "0x400", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 }, 00:07:20.026 "nvme_pcie": { 00:07:20.026 "mask": "0x800", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 }, 00:07:20.026 "iaa": { 00:07:20.026 "mask": "0x1000", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 }, 00:07:20.026 "nvme_tcp": { 00:07:20.026 "mask": "0x2000", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 }, 00:07:20.026 "bdev_nvme": { 00:07:20.026 "mask": "0x4000", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 }, 00:07:20.026 "sock": { 00:07:20.026 "mask": "0x8000", 00:07:20.026 "tpoint_mask": "0x0" 00:07:20.026 } 00:07:20.026 }' 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:07:20.026 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:07:20.287 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:07:20.287 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:07:20.287 22:14:30 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:07:20.287 00:07:20.287 real 0m0.229s 00:07:20.287 user 0m0.193s 00:07:20.287 sys 0m0.029s 00:07:20.287 22:14:30 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:20.287 22:14:30 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:07:20.287 ************************************ 00:07:20.287 END TEST rpc_trace_cmd_test 00:07:20.287 ************************************ 00:07:20.287 22:14:30 rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:20.287 22:14:30 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:07:20.287 22:14:30 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:07:20.287 22:14:30 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:07:20.287 22:14:30 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:20.287 22:14:30 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.287 22:14:30 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:20.287 ************************************ 00:07:20.287 START TEST rpc_daemon_integrity 00:07:20.287 ************************************ 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:20.287 { 00:07:20.287 "name": "Malloc2", 00:07:20.287 "aliases": [ 00:07:20.287 "6cb54afc-7632-4205-9ad8-181116f023ea" 00:07:20.287 ], 00:07:20.287 "product_name": "Malloc disk", 00:07:20.287 "block_size": 512, 00:07:20.287 "num_blocks": 16384, 00:07:20.287 "uuid": "6cb54afc-7632-4205-9ad8-181116f023ea", 00:07:20.287 "assigned_rate_limits": { 00:07:20.287 "rw_ios_per_sec": 0, 00:07:20.287 "rw_mbytes_per_sec": 0, 00:07:20.287 "r_mbytes_per_sec": 0, 00:07:20.287 "w_mbytes_per_sec": 0 00:07:20.287 }, 00:07:20.287 "claimed": false, 00:07:20.287 "zoned": false, 00:07:20.287 "supported_io_types": { 00:07:20.287 "read": true, 00:07:20.287 "write": true, 00:07:20.287 "unmap": true, 00:07:20.287 "flush": true, 00:07:20.287 "reset": true, 00:07:20.287 "nvme_admin": false, 00:07:20.287 "nvme_io": false, 00:07:20.287 "nvme_io_md": false, 00:07:20.287 "write_zeroes": true, 00:07:20.287 "zcopy": true, 00:07:20.287 "get_zone_info": false, 00:07:20.287 "zone_management": false, 00:07:20.287 "zone_append": false, 00:07:20.287 "compare": false, 00:07:20.287 "compare_and_write": false, 00:07:20.287 "abort": true, 00:07:20.287 "seek_hole": false, 00:07:20.287 "seek_data": false, 00:07:20.287 "copy": true, 00:07:20.287 "nvme_iov_md": false 00:07:20.287 }, 00:07:20.287 "memory_domains": [ 00:07:20.287 { 00:07:20.287 "dma_device_id": "system", 00:07:20.287 "dma_device_type": 1 00:07:20.287 }, 00:07:20.287 { 00:07:20.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:20.287 "dma_device_type": 2 00:07:20.287 } 00:07:20.287 ], 00:07:20.287 "driver_specific": {} 00:07:20.287 } 00:07:20.287 ]' 00:07:20.287 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:07:20.547 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:20.547 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:07:20.547 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:20.547 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:20.547 [2024-07-12 22:14:30.637555] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:07:20.547 [2024-07-12 22:14:30.637600] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:20.547 
[2024-07-12 22:14:30.637625] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b10b20 00:07:20.547 [2024-07-12 22:14:30.637638] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:20.547 [2024-07-12 22:14:30.639018] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:20.547 [2024-07-12 22:14:30.639047] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:20.547 Passthru0 00:07:20.547 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:20.547 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:20.547 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:20.547 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:20.547 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:20.547 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:20.547 { 00:07:20.547 "name": "Malloc2", 00:07:20.547 "aliases": [ 00:07:20.547 "6cb54afc-7632-4205-9ad8-181116f023ea" 00:07:20.547 ], 00:07:20.547 "product_name": "Malloc disk", 00:07:20.547 "block_size": 512, 00:07:20.547 "num_blocks": 16384, 00:07:20.547 "uuid": "6cb54afc-7632-4205-9ad8-181116f023ea", 00:07:20.547 "assigned_rate_limits": { 00:07:20.547 "rw_ios_per_sec": 0, 00:07:20.547 "rw_mbytes_per_sec": 0, 00:07:20.547 "r_mbytes_per_sec": 0, 00:07:20.547 "w_mbytes_per_sec": 0 00:07:20.547 }, 00:07:20.547 "claimed": true, 00:07:20.547 "claim_type": "exclusive_write", 00:07:20.547 "zoned": false, 00:07:20.547 "supported_io_types": { 00:07:20.547 "read": true, 00:07:20.547 "write": true, 00:07:20.547 "unmap": true, 00:07:20.547 "flush": true, 00:07:20.547 "reset": true, 00:07:20.547 "nvme_admin": false, 00:07:20.547 "nvme_io": false, 00:07:20.547 "nvme_io_md": false, 00:07:20.547 "write_zeroes": true, 00:07:20.547 "zcopy": true, 00:07:20.547 "get_zone_info": false, 00:07:20.547 "zone_management": false, 00:07:20.547 "zone_append": false, 00:07:20.547 "compare": false, 00:07:20.547 "compare_and_write": false, 00:07:20.547 "abort": true, 00:07:20.547 "seek_hole": false, 00:07:20.547 "seek_data": false, 00:07:20.547 "copy": true, 00:07:20.547 "nvme_iov_md": false 00:07:20.547 }, 00:07:20.547 "memory_domains": [ 00:07:20.547 { 00:07:20.547 "dma_device_id": "system", 00:07:20.547 "dma_device_type": 1 00:07:20.547 }, 00:07:20.547 { 00:07:20.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:20.548 "dma_device_type": 2 00:07:20.548 } 00:07:20.548 ], 00:07:20.548 "driver_specific": {} 00:07:20.548 }, 00:07:20.548 { 00:07:20.548 "name": "Passthru0", 00:07:20.548 "aliases": [ 00:07:20.548 "31e9f339-27a2-5cde-a1b2-0d8112a0cf85" 00:07:20.548 ], 00:07:20.548 "product_name": "passthru", 00:07:20.548 "block_size": 512, 00:07:20.548 "num_blocks": 16384, 00:07:20.548 "uuid": "31e9f339-27a2-5cde-a1b2-0d8112a0cf85", 00:07:20.548 "assigned_rate_limits": { 00:07:20.548 "rw_ios_per_sec": 0, 00:07:20.548 "rw_mbytes_per_sec": 0, 00:07:20.548 "r_mbytes_per_sec": 0, 00:07:20.548 "w_mbytes_per_sec": 0 00:07:20.548 }, 00:07:20.548 "claimed": false, 00:07:20.548 "zoned": false, 00:07:20.548 "supported_io_types": { 00:07:20.548 "read": true, 00:07:20.548 "write": true, 00:07:20.548 "unmap": true, 00:07:20.548 "flush": true, 00:07:20.548 "reset": true, 00:07:20.548 "nvme_admin": false, 00:07:20.548 "nvme_io": false, 00:07:20.548 "nvme_io_md": false, 00:07:20.548 
"write_zeroes": true, 00:07:20.548 "zcopy": true, 00:07:20.548 "get_zone_info": false, 00:07:20.548 "zone_management": false, 00:07:20.548 "zone_append": false, 00:07:20.548 "compare": false, 00:07:20.548 "compare_and_write": false, 00:07:20.548 "abort": true, 00:07:20.548 "seek_hole": false, 00:07:20.548 "seek_data": false, 00:07:20.548 "copy": true, 00:07:20.548 "nvme_iov_md": false 00:07:20.548 }, 00:07:20.548 "memory_domains": [ 00:07:20.548 { 00:07:20.548 "dma_device_id": "system", 00:07:20.548 "dma_device_type": 1 00:07:20.548 }, 00:07:20.548 { 00:07:20.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:20.548 "dma_device_type": 2 00:07:20.548 } 00:07:20.548 ], 00:07:20.548 "driver_specific": { 00:07:20.548 "passthru": { 00:07:20.548 "name": "Passthru0", 00:07:20.548 "base_bdev_name": "Malloc2" 00:07:20.548 } 00:07:20.548 } 00:07:20.548 } 00:07:20.548 ]' 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:20.548 00:07:20.548 real 0m0.298s 00:07:20.548 user 0m0.184s 00:07:20.548 sys 0m0.048s 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:20.548 22:14:30 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:20.548 ************************************ 00:07:20.548 END TEST rpc_daemon_integrity 00:07:20.548 ************************************ 00:07:20.548 22:14:30 rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:20.548 22:14:30 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:07:20.548 22:14:30 rpc -- rpc/rpc.sh@84 -- # killprocess 3381010 00:07:20.548 22:14:30 rpc -- common/autotest_common.sh@948 -- # '[' -z 3381010 ']' 00:07:20.548 22:14:30 rpc -- common/autotest_common.sh@952 -- # kill -0 3381010 00:07:20.548 22:14:30 rpc -- common/autotest_common.sh@953 -- # uname 00:07:20.548 22:14:30 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:20.548 22:14:30 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3381010 00:07:20.807 22:14:30 rpc -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:20.807 22:14:30 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:20.807 22:14:30 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3381010' 00:07:20.807 killing process with pid 3381010 00:07:20.807 22:14:30 rpc -- common/autotest_common.sh@967 -- # kill 3381010 00:07:20.807 22:14:30 rpc -- common/autotest_common.sh@972 -- # wait 3381010 00:07:21.066 00:07:21.066 real 0m2.839s 00:07:21.066 user 0m3.559s 00:07:21.066 sys 0m0.933s 00:07:21.066 22:14:31 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:21.066 22:14:31 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:21.066 ************************************ 00:07:21.066 END TEST rpc 00:07:21.066 ************************************ 00:07:21.066 22:14:31 -- common/autotest_common.sh@1142 -- # return 0 00:07:21.066 22:14:31 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:07:21.066 22:14:31 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:21.066 22:14:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:21.066 22:14:31 -- common/autotest_common.sh@10 -- # set +x 00:07:21.066 ************************************ 00:07:21.066 START TEST skip_rpc 00:07:21.066 ************************************ 00:07:21.066 22:14:31 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:07:21.325 * Looking for test storage... 00:07:21.325 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:21.325 22:14:31 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:21.325 22:14:31 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:21.325 22:14:31 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:07:21.325 22:14:31 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:21.325 22:14:31 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:21.325 22:14:31 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:21.325 ************************************ 00:07:21.325 START TEST skip_rpc 00:07:21.325 ************************************ 00:07:21.325 22:14:31 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:07:21.325 22:14:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=3381545 00:07:21.325 22:14:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:21.325 22:14:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:07:21.325 22:14:31 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:07:21.325 [2024-07-12 22:14:31.580548] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:07:21.325 [2024-07-12 22:14:31.580611] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3381545 ] 00:07:21.585 [2024-07-12 22:14:31.710807] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.585 [2024-07-12 22:14:31.811108] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 3381545 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 3381545 ']' 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 3381545 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3381545 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3381545' 00:07:26.861 killing process with pid 3381545 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 3381545 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 3381545 00:07:26.861 00:07:26.861 real 0m5.438s 00:07:26.861 user 0m5.091s 00:07:26.861 sys 0m0.358s 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:26.861 22:14:36 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:26.861 ************************************ 00:07:26.861 END TEST skip_rpc 00:07:26.861 
************************************ 00:07:26.861 22:14:36 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:26.861 22:14:36 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:07:26.861 22:14:36 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:26.861 22:14:36 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:26.861 22:14:36 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:26.861 ************************************ 00:07:26.861 START TEST skip_rpc_with_json 00:07:26.861 ************************************ 00:07:26.861 22:14:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:07:26.861 22:14:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:07:26.861 22:14:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=3382274 00:07:26.861 22:14:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:26.861 22:14:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:26.861 22:14:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 3382274 00:07:26.861 22:14:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 3382274 ']' 00:07:26.861 22:14:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:26.861 22:14:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:26.861 22:14:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:26.861 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:26.861 22:14:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:26.861 22:14:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:26.861 [2024-07-12 22:14:37.103591] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:07:26.861 [2024-07-12 22:14:37.103661] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3382274 ] 00:07:27.120 [2024-07-12 22:14:37.233839] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.120 [2024-07-12 22:14:37.330958] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:27.749 [2024-07-12 22:14:38.030747] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:07:27.749 request: 00:07:27.749 { 00:07:27.749 "trtype": "tcp", 00:07:27.749 "method": "nvmf_get_transports", 00:07:27.749 "req_id": 1 00:07:27.749 } 00:07:27.749 Got JSON-RPC error response 00:07:27.749 response: 00:07:27.749 { 00:07:27.749 "code": -19, 00:07:27.749 "message": "No such device" 00:07:27.749 } 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:27.749 [2024-07-12 22:14:38.038887] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:27.749 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:28.009 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:28.009 22:14:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:28.009 { 00:07:28.009 "subsystems": [ 00:07:28.009 { 00:07:28.009 "subsystem": "keyring", 00:07:28.009 "config": [] 00:07:28.009 }, 00:07:28.009 { 00:07:28.009 "subsystem": "iobuf", 00:07:28.009 "config": [ 00:07:28.009 { 00:07:28.009 "method": "iobuf_set_options", 00:07:28.009 "params": { 00:07:28.009 "small_pool_count": 8192, 00:07:28.009 "large_pool_count": 1024, 00:07:28.009 "small_bufsize": 8192, 00:07:28.009 "large_bufsize": 135168 00:07:28.009 } 00:07:28.009 } 00:07:28.009 ] 00:07:28.009 }, 00:07:28.009 { 00:07:28.009 "subsystem": "sock", 00:07:28.009 "config": [ 00:07:28.009 { 00:07:28.009 "method": "sock_set_default_impl", 00:07:28.009 "params": { 00:07:28.009 "impl_name": "posix" 00:07:28.009 } 00:07:28.009 }, 00:07:28.009 { 00:07:28.009 "method": "sock_impl_set_options", 00:07:28.009 "params": { 00:07:28.009 "impl_name": "ssl", 00:07:28.009 "recv_buf_size": 4096, 00:07:28.009 "send_buf_size": 4096, 
00:07:28.009 "enable_recv_pipe": true, 00:07:28.009 "enable_quickack": false, 00:07:28.009 "enable_placement_id": 0, 00:07:28.009 "enable_zerocopy_send_server": true, 00:07:28.009 "enable_zerocopy_send_client": false, 00:07:28.009 "zerocopy_threshold": 0, 00:07:28.009 "tls_version": 0, 00:07:28.009 "enable_ktls": false 00:07:28.009 } 00:07:28.009 }, 00:07:28.009 { 00:07:28.009 "method": "sock_impl_set_options", 00:07:28.009 "params": { 00:07:28.009 "impl_name": "posix", 00:07:28.009 "recv_buf_size": 2097152, 00:07:28.009 "send_buf_size": 2097152, 00:07:28.009 "enable_recv_pipe": true, 00:07:28.009 "enable_quickack": false, 00:07:28.009 "enable_placement_id": 0, 00:07:28.009 "enable_zerocopy_send_server": true, 00:07:28.009 "enable_zerocopy_send_client": false, 00:07:28.009 "zerocopy_threshold": 0, 00:07:28.009 "tls_version": 0, 00:07:28.009 "enable_ktls": false 00:07:28.009 } 00:07:28.009 } 00:07:28.009 ] 00:07:28.009 }, 00:07:28.009 { 00:07:28.009 "subsystem": "vmd", 00:07:28.009 "config": [] 00:07:28.009 }, 00:07:28.009 { 00:07:28.009 "subsystem": "accel", 00:07:28.009 "config": [ 00:07:28.009 { 00:07:28.009 "method": "accel_set_options", 00:07:28.009 "params": { 00:07:28.009 "small_cache_size": 128, 00:07:28.009 "large_cache_size": 16, 00:07:28.009 "task_count": 2048, 00:07:28.009 "sequence_count": 2048, 00:07:28.009 "buf_count": 2048 00:07:28.009 } 00:07:28.009 } 00:07:28.009 ] 00:07:28.009 }, 00:07:28.009 { 00:07:28.009 "subsystem": "bdev", 00:07:28.009 "config": [ 00:07:28.009 { 00:07:28.009 "method": "bdev_set_options", 00:07:28.009 "params": { 00:07:28.009 "bdev_io_pool_size": 65535, 00:07:28.009 "bdev_io_cache_size": 256, 00:07:28.009 "bdev_auto_examine": true, 00:07:28.009 "iobuf_small_cache_size": 128, 00:07:28.009 "iobuf_large_cache_size": 16 00:07:28.009 } 00:07:28.009 }, 00:07:28.009 { 00:07:28.009 "method": "bdev_raid_set_options", 00:07:28.009 "params": { 00:07:28.009 "process_window_size_kb": 1024 00:07:28.009 } 00:07:28.009 }, 00:07:28.009 { 00:07:28.009 "method": "bdev_iscsi_set_options", 00:07:28.009 "params": { 00:07:28.009 "timeout_sec": 30 00:07:28.009 } 00:07:28.009 }, 00:07:28.009 { 00:07:28.009 "method": "bdev_nvme_set_options", 00:07:28.009 "params": { 00:07:28.009 "action_on_timeout": "none", 00:07:28.010 "timeout_us": 0, 00:07:28.010 "timeout_admin_us": 0, 00:07:28.010 "keep_alive_timeout_ms": 10000, 00:07:28.010 "arbitration_burst": 0, 00:07:28.010 "low_priority_weight": 0, 00:07:28.010 "medium_priority_weight": 0, 00:07:28.010 "high_priority_weight": 0, 00:07:28.010 "nvme_adminq_poll_period_us": 10000, 00:07:28.010 "nvme_ioq_poll_period_us": 0, 00:07:28.010 "io_queue_requests": 0, 00:07:28.010 "delay_cmd_submit": true, 00:07:28.010 "transport_retry_count": 4, 00:07:28.010 "bdev_retry_count": 3, 00:07:28.010 "transport_ack_timeout": 0, 00:07:28.010 "ctrlr_loss_timeout_sec": 0, 00:07:28.010 "reconnect_delay_sec": 0, 00:07:28.010 "fast_io_fail_timeout_sec": 0, 00:07:28.010 "disable_auto_failback": false, 00:07:28.010 "generate_uuids": false, 00:07:28.010 "transport_tos": 0, 00:07:28.010 "nvme_error_stat": false, 00:07:28.010 "rdma_srq_size": 0, 00:07:28.010 "io_path_stat": false, 00:07:28.010 "allow_accel_sequence": false, 00:07:28.010 "rdma_max_cq_size": 0, 00:07:28.010 "rdma_cm_event_timeout_ms": 0, 00:07:28.010 "dhchap_digests": [ 00:07:28.010 "sha256", 00:07:28.010 "sha384", 00:07:28.010 "sha512" 00:07:28.010 ], 00:07:28.010 "dhchap_dhgroups": [ 00:07:28.010 "null", 00:07:28.010 "ffdhe2048", 00:07:28.010 "ffdhe3072", 00:07:28.010 "ffdhe4096", 00:07:28.010 
"ffdhe6144", 00:07:28.010 "ffdhe8192" 00:07:28.010 ] 00:07:28.010 } 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "method": "bdev_nvme_set_hotplug", 00:07:28.010 "params": { 00:07:28.010 "period_us": 100000, 00:07:28.010 "enable": false 00:07:28.010 } 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "method": "bdev_wait_for_examine" 00:07:28.010 } 00:07:28.010 ] 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "subsystem": "scsi", 00:07:28.010 "config": null 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "subsystem": "scheduler", 00:07:28.010 "config": [ 00:07:28.010 { 00:07:28.010 "method": "framework_set_scheduler", 00:07:28.010 "params": { 00:07:28.010 "name": "static" 00:07:28.010 } 00:07:28.010 } 00:07:28.010 ] 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "subsystem": "vhost_scsi", 00:07:28.010 "config": [] 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "subsystem": "vhost_blk", 00:07:28.010 "config": [] 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "subsystem": "ublk", 00:07:28.010 "config": [] 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "subsystem": "nbd", 00:07:28.010 "config": [] 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "subsystem": "nvmf", 00:07:28.010 "config": [ 00:07:28.010 { 00:07:28.010 "method": "nvmf_set_config", 00:07:28.010 "params": { 00:07:28.010 "discovery_filter": "match_any", 00:07:28.010 "admin_cmd_passthru": { 00:07:28.010 "identify_ctrlr": false 00:07:28.010 } 00:07:28.010 } 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "method": "nvmf_set_max_subsystems", 00:07:28.010 "params": { 00:07:28.010 "max_subsystems": 1024 00:07:28.010 } 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "method": "nvmf_set_crdt", 00:07:28.010 "params": { 00:07:28.010 "crdt1": 0, 00:07:28.010 "crdt2": 0, 00:07:28.010 "crdt3": 0 00:07:28.010 } 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "method": "nvmf_create_transport", 00:07:28.010 "params": { 00:07:28.010 "trtype": "TCP", 00:07:28.010 "max_queue_depth": 128, 00:07:28.010 "max_io_qpairs_per_ctrlr": 127, 00:07:28.010 "in_capsule_data_size": 4096, 00:07:28.010 "max_io_size": 131072, 00:07:28.010 "io_unit_size": 131072, 00:07:28.010 "max_aq_depth": 128, 00:07:28.010 "num_shared_buffers": 511, 00:07:28.010 "buf_cache_size": 4294967295, 00:07:28.010 "dif_insert_or_strip": false, 00:07:28.010 "zcopy": false, 00:07:28.010 "c2h_success": true, 00:07:28.010 "sock_priority": 0, 00:07:28.010 "abort_timeout_sec": 1, 00:07:28.010 "ack_timeout": 0, 00:07:28.010 "data_wr_pool_size": 0 00:07:28.010 } 00:07:28.010 } 00:07:28.010 ] 00:07:28.010 }, 00:07:28.010 { 00:07:28.010 "subsystem": "iscsi", 00:07:28.010 "config": [ 00:07:28.010 { 00:07:28.010 "method": "iscsi_set_options", 00:07:28.010 "params": { 00:07:28.010 "node_base": "iqn.2016-06.io.spdk", 00:07:28.010 "max_sessions": 128, 00:07:28.010 "max_connections_per_session": 2, 00:07:28.010 "max_queue_depth": 64, 00:07:28.010 "default_time2wait": 2, 00:07:28.010 "default_time2retain": 20, 00:07:28.010 "first_burst_length": 8192, 00:07:28.010 "immediate_data": true, 00:07:28.010 "allow_duplicated_isid": false, 00:07:28.010 "error_recovery_level": 0, 00:07:28.010 "nop_timeout": 60, 00:07:28.010 "nop_in_interval": 30, 00:07:28.010 "disable_chap": false, 00:07:28.010 "require_chap": false, 00:07:28.010 "mutual_chap": false, 00:07:28.010 "chap_group": 0, 00:07:28.010 "max_large_datain_per_connection": 64, 00:07:28.010 "max_r2t_per_connection": 4, 00:07:28.010 "pdu_pool_size": 36864, 00:07:28.010 "immediate_data_pool_size": 16384, 00:07:28.010 "data_out_pool_size": 2048 00:07:28.010 } 00:07:28.010 } 00:07:28.010 ] 00:07:28.010 } 
00:07:28.010 ] 00:07:28.010 } 00:07:28.010 22:14:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:07:28.010 22:14:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 3382274 00:07:28.010 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 3382274 ']' 00:07:28.010 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 3382274 00:07:28.010 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:07:28.010 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:28.010 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3382274 00:07:28.010 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:28.010 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:28.010 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3382274' 00:07:28.010 killing process with pid 3382274 00:07:28.010 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 3382274 00:07:28.010 22:14:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 3382274 00:07:28.580 22:14:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=3382468 00:07:28.580 22:14:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:28.580 22:14:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:07:33.887 22:14:43 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 3382468 00:07:33.887 22:14:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 3382468 ']' 00:07:33.887 22:14:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 3382468 00:07:33.887 22:14:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:07:33.887 22:14:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:33.887 22:14:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3382468 00:07:33.887 22:14:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:33.887 22:14:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:33.887 22:14:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3382468' 00:07:33.887 killing process with pid 3382468 00:07:33.887 22:14:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 3382468 00:07:33.887 22:14:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 3382468 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:33.887 00:07:33.887 real 0m7.024s 00:07:33.887 user 0m6.721s 00:07:33.887 sys 0m0.841s 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:33.887 
22:14:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:33.887 ************************************ 00:07:33.887 END TEST skip_rpc_with_json 00:07:33.887 ************************************ 00:07:33.887 22:14:44 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:33.887 22:14:44 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:07:33.887 22:14:44 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:33.887 22:14:44 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:33.887 22:14:44 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:33.887 ************************************ 00:07:33.887 START TEST skip_rpc_with_delay 00:07:33.887 ************************************ 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:33.887 [2024-07-12 22:14:44.197544] app.c: 831:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:07:33.887 [2024-07-12 22:14:44.197632] app.c: 710:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:33.887 00:07:33.887 real 0m0.079s 00:07:33.887 user 0m0.043s 00:07:33.887 sys 0m0.035s 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:33.887 22:14:44 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:07:33.887 ************************************ 00:07:33.887 END TEST skip_rpc_with_delay 00:07:33.887 ************************************ 00:07:34.146 22:14:44 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:34.146 22:14:44 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:07:34.146 22:14:44 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:07:34.146 22:14:44 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:07:34.146 22:14:44 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:34.146 22:14:44 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:34.146 22:14:44 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:34.146 ************************************ 00:07:34.146 START TEST exit_on_failed_rpc_init 00:07:34.146 ************************************ 00:07:34.146 22:14:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:07:34.146 22:14:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=3383296 00:07:34.146 22:14:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 3383296 00:07:34.146 22:14:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:34.146 22:14:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 3383296 ']' 00:07:34.146 22:14:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:34.146 22:14:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:34.146 22:14:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:34.146 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:34.146 22:14:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:34.146 22:14:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:34.146 [2024-07-12 22:14:44.355392] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:07:34.146 [2024-07-12 22:14:44.355459] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3383296 ] 00:07:34.405 [2024-07-12 22:14:44.484574] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.405 [2024-07-12 22:14:44.587159] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.973 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:34.973 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:07:34.973 22:14:45 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:34.974 22:14:45 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:34.974 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:07:34.974 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:34.974 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:34.974 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:34.974 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:34.974 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:34.974 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:34.974 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:34.974 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:34.974 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:34.974 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:35.232 [2024-07-12 22:14:45.327738] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:07:35.232 [2024-07-12 22:14:45.327871] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3383398 ] 00:07:35.232 [2024-07-12 22:14:45.512281] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.491 [2024-07-12 22:14:45.614286] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:35.491 [2024-07-12 22:14:45.614363] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:07:35.491 [2024-07-12 22:14:45.614380] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:07:35.491 [2024-07-12 22:14:45.614392] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 3383296 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 3383296 ']' 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 3383296 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3383296 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3383296' 00:07:35.491 killing process with pid 3383296 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 3383296 00:07:35.491 22:14:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 3383296 00:07:36.059 00:07:36.059 real 0m1.839s 00:07:36.059 user 0m2.127s 00:07:36.059 sys 0m0.633s 00:07:36.059 22:14:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:36.059 22:14:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:36.059 ************************************ 00:07:36.059 END TEST exit_on_failed_rpc_init 00:07:36.059 ************************************ 00:07:36.059 22:14:46 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:36.059 22:14:46 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:36.059 00:07:36.059 real 0m14.812s 00:07:36.059 user 0m14.145s 00:07:36.059 sys 0m2.167s 00:07:36.059 22:14:46 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:36.059 22:14:46 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:36.059 ************************************ 00:07:36.059 END TEST skip_rpc 00:07:36.059 ************************************ 00:07:36.059 22:14:46 -- common/autotest_common.sh@1142 -- # return 0 00:07:36.059 22:14:46 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:36.059 22:14:46 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:36.059 22:14:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.059 22:14:46 -- common/autotest_common.sh@10 -- # set +x 00:07:36.059 ************************************ 00:07:36.059 START TEST rpc_client 00:07:36.059 ************************************ 00:07:36.059 22:14:46 rpc_client -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:36.059 * Looking for test storage... 00:07:36.059 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:07:36.059 22:14:46 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:07:36.059 OK 00:07:36.319 22:14:46 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:07:36.319 00:07:36.319 real 0m0.128s 00:07:36.319 user 0m0.047s 00:07:36.319 sys 0m0.089s 00:07:36.319 22:14:46 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:36.319 22:14:46 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:07:36.319 ************************************ 00:07:36.319 END TEST rpc_client 00:07:36.319 ************************************ 00:07:36.319 22:14:46 -- common/autotest_common.sh@1142 -- # return 0 00:07:36.319 22:14:46 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:07:36.319 22:14:46 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:36.319 22:14:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.319 22:14:46 -- common/autotest_common.sh@10 -- # set +x 00:07:36.319 ************************************ 00:07:36.319 START TEST json_config 00:07:36.319 ************************************ 00:07:36.319 22:14:46 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@7 -- # uname -s 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:36.319 22:14:46 json_config -- 
nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:36.319 22:14:46 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:36.319 22:14:46 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:36.319 22:14:46 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:36.319 22:14:46 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.319 22:14:46 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.319 22:14:46 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.319 22:14:46 json_config -- paths/export.sh@5 -- # export PATH 00:07:36.319 22:14:46 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@47 -- # : 0 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:36.319 22:14:46 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:07:36.319 
22:14:46 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:07:36.319 INFO: JSON configuration test init 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:07:36.319 22:14:46 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:36.319 22:14:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:07:36.319 22:14:46 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:36.319 22:14:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:36.319 22:14:46 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:07:36.319 22:14:46 json_config -- json_config/common.sh@9 -- # local app=target 00:07:36.319 22:14:46 json_config -- json_config/common.sh@10 -- # shift 00:07:36.319 22:14:46 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:36.319 22:14:46 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:36.319 22:14:46 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:07:36.319 22:14:46 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:36.319 22:14:46 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:36.319 22:14:46 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=3383686 00:07:36.319 22:14:46 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:36.319 Waiting for target to run... 
00:07:36.319 22:14:46 json_config -- json_config/common.sh@25 -- # waitforlisten 3383686 /var/tmp/spdk_tgt.sock 00:07:36.319 22:14:46 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:07:36.319 22:14:46 json_config -- common/autotest_common.sh@829 -- # '[' -z 3383686 ']' 00:07:36.319 22:14:46 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:36.319 22:14:46 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:36.319 22:14:46 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:36.319 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:36.319 22:14:46 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:36.319 22:14:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:36.578 [2024-07-12 22:14:46.660847] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:07:36.578 [2024-07-12 22:14:46.660918] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3383686 ] 00:07:36.838 [2024-07-12 22:14:47.027563] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.838 [2024-07-12 22:14:47.119462] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.406 22:14:47 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:37.406 22:14:47 json_config -- common/autotest_common.sh@862 -- # return 0 00:07:37.406 22:14:47 json_config -- json_config/common.sh@26 -- # echo '' 00:07:37.406 00:07:37.406 22:14:47 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:07:37.406 22:14:47 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:07:37.406 22:14:47 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:37.406 22:14:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:37.406 22:14:47 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:07:37.406 22:14:47 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:07:37.406 22:14:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:07:37.664 22:14:47 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:07:37.664 22:14:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:07:37.924 [2024-07-12 22:14:48.046327] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:07:37.924 22:14:48 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:07:37.924 22:14:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:07:38.184 [2024-07-12 22:14:48.290972] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:07:38.184 22:14:48 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:07:38.184 22:14:48 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:38.184 22:14:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:38.184 22:14:48 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:07:38.184 22:14:48 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:07:38.184 22:14:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:07:38.444 [2024-07-12 22:14:48.592492] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:07:40.980 22:14:51 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:07:40.980 22:14:51 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:07:40.980 22:14:51 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:40.980 22:14:51 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:40.980 22:14:51 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:07:40.980 22:14:51 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:07:40.980 22:14:51 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:07:40.980 22:14:51 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:07:40.980 22:14:51 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:07:40.980 22:14:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@48 -- # local get_types 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:07:41.240 22:14:51 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:41.240 22:14:51 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@55 -- # return 0 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:07:41.240 22:14:51 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:41.240 22:14:51 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@111 -- # 
get_notifications 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:07:41.240 22:14:51 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:07:41.240 22:14:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:07:41.500 22:14:51 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:07:41.500 22:14:51 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:41.500 22:14:51 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:41.500 22:14:51 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:07:41.500 22:14:51 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:07:41.500 22:14:51 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:07:41.500 22:14:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:07:41.759 Nvme0n1p0 Nvme0n1p1 00:07:41.759 22:14:51 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:07:41.759 22:14:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:07:42.018 [2024-07-12 22:14:52.202764] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:42.018 [2024-07-12 22:14:52.202822] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:42.018 00:07:42.018 22:14:52 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:07:42.018 22:14:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:07:42.278 Malloc3 00:07:42.278 22:14:52 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:07:42.278 22:14:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:07:42.537 [2024-07-12 22:14:52.688150] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:42.537 [2024-07-12 22:14:52.688203] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:42.537 [2024-07-12 22:14:52.688239] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x145ba00 00:07:42.537 [2024-07-12 22:14:52.688255] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:42.537 [2024-07-12 22:14:52.689916] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:42.537 [2024-07-12 22:14:52.689956] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:07:42.537 PTBdevFromMalloc3 00:07:42.537 22:14:52 json_config 
-- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:07:42.537 22:14:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:07:42.795 Null0 00:07:42.795 22:14:52 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:07:42.795 22:14:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:07:43.053 Malloc0 00:07:43.053 22:14:53 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:07:43.053 22:14:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:07:43.313 Malloc1 00:07:43.313 22:14:53 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:07:43.313 22:14:53 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:07:43.573 102400+0 records in 00:07:43.573 102400+0 records out 00:07:43.573 104857600 bytes (105 MB, 100 MiB) copied, 0.312417 s, 336 MB/s 00:07:43.573 22:14:53 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:07:43.573 22:14:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:07:43.832 aio_disk 00:07:43.832 22:14:54 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:07:43.832 22:14:54 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:07:43.832 22:14:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:07:49.110 adaceb3e-8111-433d-8a2e-65ace8599018 00:07:49.110 22:14:58 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:07:49.110 22:14:58 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:07:49.110 22:14:58 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:07:49.110 22:14:58 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:07:49.110 22:14:58 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:07:49.110 22:14:59 json_config -- json_config/json_config.sh@154 -- # tgt_rpc 
bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:07:49.110 22:14:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:07:49.110 22:14:59 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:07:49.110 22:14:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:07:49.369 22:14:59 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:07:49.369 22:14:59 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:07:49.369 22:14:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:07:49.629 MallocForCryptoBdev 00:07:49.629 22:14:59 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:07:49.629 22:14:59 json_config -- json_config/json_config.sh@159 -- # wc -l 00:07:49.629 22:14:59 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:07:49.629 22:14:59 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:07:49.629 22:14:59 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:07:49.629 22:14:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:07:49.889 [2024-07-12 22:15:00.175078] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:07:49.889 CryptoMallocBdev 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:aa7937f6-1cd8-486a-bf20-8b14b2c0f91e bdev_register:df2298c0-84e1-4e77-b33b-112fe20a22a4 bdev_register:033970e6-0570-4c77-ba82-6a438152d0e8 bdev_register:e787d834-9235-4666-8d28-1be85cb55120 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 
bdev_register:Malloc1 bdev_register:aio_disk bdev_register:aa7937f6-1cd8-486a-bf20-8b14b2c0f91e bdev_register:df2298c0-84e1-4e77-b33b-112fe20a22a4 bdev_register:033970e6-0570-4c77-ba82-6a438152d0e8 bdev_register:e787d834-9235-4666-8d28-1be85cb55120 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@71 -- # sort 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@72 -- # sort 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:07:49.889 22:15:00 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:07:49.889 22:15:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:07:50.148 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:07:50.148 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.148 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.148 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:07:50.148 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.148 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.148 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 
json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aa7937f6-1cd8-486a-bf20-8b14b2c0f91e 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:df2298c0-84e1-4e77-b33b-112fe20a22a4 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:033970e6-0570-4c77-ba82-6a438152d0e8 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:e787d834-9235-4666-8d28-1be85cb55120 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:033970e6-0570-4c77-ba82-6a438152d0e8 bdev_register:aa7937f6-1cd8-486a-bf20-8b14b2c0f91e bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:df2298c0-84e1-4e77-b33b-112fe20a22a4 bdev_register:e787d834-9235-4666-8d28-1be85cb55120 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev 
bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\3\3\9\7\0\e\6\-\0\5\7\0\-\4\c\7\7\-\b\a\8\2\-\6\a\4\3\8\1\5\2\d\0\e\8\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\a\7\9\3\7\f\6\-\1\c\d\8\-\4\8\6\a\-\b\f\2\0\-\8\b\1\4\b\2\c\0\f\9\1\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\d\f\2\2\9\8\c\0\-\8\4\e\1\-\4\e\7\7\-\b\3\3\b\-\1\1\2\f\e\2\0\a\2\2\a\4\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\e\7\8\7\d\8\3\4\-\9\2\3\5\-\4\6\6\6\-\8\d\2\8\-\1\b\e\8\5\c\b\5\5\1\2\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@86 -- # cat 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:033970e6-0570-4c77-ba82-6a438152d0e8 bdev_register:aa7937f6-1cd8-486a-bf20-8b14b2c0f91e bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:df2298c0-84e1-4e77-b33b-112fe20a22a4 bdev_register:e787d834-9235-4666-8d28-1be85cb55120 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:07:50.149 Expected events matched: 00:07:50.149 bdev_register:033970e6-0570-4c77-ba82-6a438152d0e8 00:07:50.149 bdev_register:aa7937f6-1cd8-486a-bf20-8b14b2c0f91e 00:07:50.149 bdev_register:aio_disk 00:07:50.149 bdev_register:CryptoMallocBdev 00:07:50.149 bdev_register:df2298c0-84e1-4e77-b33b-112fe20a22a4 00:07:50.149 bdev_register:e787d834-9235-4666-8d28-1be85cb55120 00:07:50.149 bdev_register:Malloc0 00:07:50.149 bdev_register:Malloc0p0 00:07:50.149 bdev_register:Malloc0p1 00:07:50.149 bdev_register:Malloc0p2 00:07:50.149 bdev_register:Malloc1 00:07:50.149 bdev_register:Malloc3 00:07:50.149 bdev_register:MallocForCryptoBdev 00:07:50.149 bdev_register:Null0 00:07:50.149 bdev_register:Nvme0n1 00:07:50.149 bdev_register:Nvme0n1p0 00:07:50.149 bdev_register:Nvme0n1p1 00:07:50.149 bdev_register:PTBdevFromMalloc3 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:07:50.149 22:15:00 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:50.149 22:15:00 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:07:50.149 22:15:00 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:07:50.149 22:15:00 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:50.149 22:15:00 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:07:50.409 22:15:00 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:07:50.409 22:15:00 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:07:50.409 22:15:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:07:50.668 MallocBdevForConfigChangeCheck 00:07:50.668 22:15:00 json_config -- json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:07:50.668 22:15:00 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:50.668 22:15:00 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:50.668 22:15:00 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:07:50.668 22:15:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:50.967 22:15:01 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:07:50.967 INFO: shutting down applications... 00:07:50.967 22:15:01 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:07:50.967 22:15:01 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:07:50.967 22:15:01 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:07:50.967 22:15:01 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:07:51.256 [2024-07-12 22:15:01.302581] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:07:54.557 Calling clear_iscsi_subsystem 00:07:54.557 Calling clear_nvmf_subsystem 00:07:54.557 Calling clear_nbd_subsystem 00:07:54.557 Calling clear_ublk_subsystem 00:07:54.557 Calling clear_vhost_blk_subsystem 00:07:54.557 Calling clear_vhost_scsi_subsystem 00:07:54.557 Calling clear_bdev_subsystem 00:07:54.557 22:15:04 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:07:54.557 22:15:04 json_config -- json_config/json_config.sh@343 -- # count=100 00:07:54.557 22:15:04 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:07:54.557 22:15:04 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:54.557 22:15:04 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:07:54.557 22:15:04 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:07:54.557 22:15:04 json_config -- json_config/json_config.sh@345 -- # break 00:07:54.557 22:15:04 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:07:54.557 22:15:04 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:07:54.557 22:15:04 json_config -- json_config/common.sh@31 -- # local app=target 00:07:54.557 22:15:04 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:54.557 22:15:04 json_config -- json_config/common.sh@35 -- # [[ -n 
3383686 ]] 00:07:54.557 22:15:04 json_config -- json_config/common.sh@38 -- # kill -SIGINT 3383686 00:07:54.557 22:15:04 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:54.557 22:15:04 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:54.557 22:15:04 json_config -- json_config/common.sh@41 -- # kill -0 3383686 00:07:54.557 22:15:04 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:07:54.816 22:15:05 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:07:54.816 22:15:05 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:54.816 22:15:05 json_config -- json_config/common.sh@41 -- # kill -0 3383686 00:07:54.816 22:15:05 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:54.816 22:15:05 json_config -- json_config/common.sh@43 -- # break 00:07:54.816 22:15:05 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:54.816 22:15:05 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:54.816 SPDK target shutdown done 00:07:54.816 22:15:05 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:07:54.816 INFO: relaunching applications... 00:07:54.816 22:15:05 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:54.816 22:15:05 json_config -- json_config/common.sh@9 -- # local app=target 00:07:54.816 22:15:05 json_config -- json_config/common.sh@10 -- # shift 00:07:54.816 22:15:05 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:54.816 22:15:05 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:54.816 22:15:05 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:07:54.816 22:15:05 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:54.816 22:15:05 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:54.816 22:15:05 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=3386578 00:07:54.816 22:15:05 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:54.816 Waiting for target to run... 00:07:54.816 22:15:05 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:54.816 22:15:05 json_config -- json_config/common.sh@25 -- # waitforlisten 3386578 /var/tmp/spdk_tgt.sock 00:07:54.816 22:15:05 json_config -- common/autotest_common.sh@829 -- # '[' -z 3386578 ']' 00:07:54.816 22:15:05 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:54.816 22:15:05 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:54.816 22:15:05 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:54.816 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:54.816 22:15:05 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:54.817 22:15:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:54.817 [2024-07-12 22:15:05.135423] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
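The shutdown-and-relaunch sequence traced above reduces to the sketch below. It is an approximation of json_config/common.sh rather than its exact body; paths are shortened to the repository root and it assumes spdk_tgt was built into build/bin as in this workspace.

  # Ask the running target to exit, then poll until its PID is gone
  # (the trace shows up to 30 tries with a 0.5 s sleep between them).
  kill -SIGINT "$tgt_pid"
  for ((i = 0; i < 30; i++)); do
      kill -0 "$tgt_pid" 2>/dev/null || break
      sleep 0.5
  done
  # Relaunch from the JSON configuration saved earlier and remember the new PID.
  ./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
      --json ./spdk_tgt_config.json &
  tgt_pid=$!

Restarting from the saved file is what lets the next step compare the regenerated configuration against the copy on disk.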
00:07:54.817 [2024-07-12 22:15:05.135503] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3386578 ] 00:07:55.754 [2024-07-12 22:15:05.759889] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.754 [2024-07-12 22:15:05.857772] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.754 [2024-07-12 22:15:05.911975] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:07:55.754 [2024-07-12 22:15:05.920006] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:07:55.754 [2024-07-12 22:15:05.928024] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:07:55.754 [2024-07-12 22:15:06.009303] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:07:58.291 [2024-07-12 22:15:08.210845] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:58.291 [2024-07-12 22:15:08.210919] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:58.291 [2024-07-12 22:15:08.210948] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:58.291 [2024-07-12 22:15:08.218865] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:07:58.291 [2024-07-12 22:15:08.218903] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:07:58.291 [2024-07-12 22:15:08.226892] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:58.291 [2024-07-12 22:15:08.226921] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:58.291 [2024-07-12 22:15:08.234913] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:07:58.291 [2024-07-12 22:15:08.234950] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:07:58.291 [2024-07-12 22:15:08.234969] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:58.291 [2024-07-12 22:15:08.608053] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:58.291 [2024-07-12 22:15:08.608101] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:58.291 [2024-07-12 22:15:08.608124] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xff6040 00:07:58.291 [2024-07-12 22:15:08.608144] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:58.291 [2024-07-12 22:15:08.608488] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:58.291 [2024-07-12 22:15:08.608513] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:07:58.550 22:15:08 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:58.550 22:15:08 json_config -- common/autotest_common.sh@862 -- # return 0 00:07:58.550 22:15:08 json_config -- json_config/common.sh@26 -- # echo '' 00:07:58.550 00:07:58.550 22:15:08 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:07:58.550 22:15:08 json_config -- 
json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 00:07:58.550 INFO: Checking if target configuration is the same... 00:07:58.550 22:15:08 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:58.550 22:15:08 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:07:58.550 22:15:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:58.550 + '[' 2 -ne 2 ']' 00:07:58.550 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:58.550 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:07:58.550 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:58.550 +++ basename /dev/fd/62 00:07:58.550 ++ mktemp /tmp/62.XXX 00:07:58.550 + tmp_file_1=/tmp/62.XDP 00:07:58.550 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:58.550 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:58.550 + tmp_file_2=/tmp/spdk_tgt_config.json.tqF 00:07:58.550 + ret=0 00:07:58.550 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:58.809 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:58.809 + diff -u /tmp/62.XDP /tmp/spdk_tgt_config.json.tqF 00:07:58.809 + echo 'INFO: JSON config files are the same' 00:07:58.809 INFO: JSON config files are the same 00:07:58.809 + rm /tmp/62.XDP /tmp/spdk_tgt_config.json.tqF 00:07:58.809 + exit 0 00:07:58.809 22:15:09 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:07:58.809 22:15:09 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:07:58.809 INFO: changing configuration and checking if this can be detected... 00:07:58.809 22:15:09 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:58.809 22:15:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:59.067 22:15:09 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:59.067 22:15:09 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:07:59.067 22:15:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:59.067 + '[' 2 -ne 2 ']' 00:07:59.067 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:59.067 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
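Stripped of the xtrace prefixes, the "configuration is the same" check above works like the sketch below. The temporary file names are illustrative, and it assumes config_filter.py reads the configuration on stdin, which the redirection-free trace lines suggest.

  # Dump the live configuration, normalize both copies, and diff them.
  ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config > /tmp/running.json
  ./test/json_config/config_filter.py -method sort < /tmp/running.json > /tmp/running.sorted
  ./test/json_config/config_filter.py -method sort < ./spdk_tgt_config.json > /tmp/saved.sorted
  diff -u /tmp/saved.sorted /tmp/running.sorted && echo 'INFO: JSON config files are the same'

The second pass below first deletes MallocBdevForConfigChangeCheck and expects the same diff to fail, which is the "configuration change detected" branch.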
00:07:59.067 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:59.067 +++ basename /dev/fd/62 00:07:59.067 ++ mktemp /tmp/62.XXX 00:07:59.067 + tmp_file_1=/tmp/62.mCn 00:07:59.067 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:59.067 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:59.067 + tmp_file_2=/tmp/spdk_tgt_config.json.2cc 00:07:59.067 + ret=0 00:07:59.067 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:59.326 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:59.585 + diff -u /tmp/62.mCn /tmp/spdk_tgt_config.json.2cc 00:07:59.585 + ret=1 00:07:59.585 + echo '=== Start of file: /tmp/62.mCn ===' 00:07:59.585 + cat /tmp/62.mCn 00:07:59.585 + echo '=== End of file: /tmp/62.mCn ===' 00:07:59.585 + echo '' 00:07:59.585 + echo '=== Start of file: /tmp/spdk_tgt_config.json.2cc ===' 00:07:59.585 + cat /tmp/spdk_tgt_config.json.2cc 00:07:59.585 + echo '=== End of file: /tmp/spdk_tgt_config.json.2cc ===' 00:07:59.585 + echo '' 00:07:59.585 + rm /tmp/62.mCn /tmp/spdk_tgt_config.json.2cc 00:07:59.585 + exit 1 00:07:59.585 22:15:09 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:07:59.585 INFO: configuration change detected. 00:07:59.585 22:15:09 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:07:59.585 22:15:09 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:07:59.585 22:15:09 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:59.585 22:15:09 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:59.585 22:15:09 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:07:59.585 22:15:09 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:07:59.585 22:15:09 json_config -- json_config/json_config.sh@317 -- # [[ -n 3386578 ]] 00:07:59.585 22:15:09 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:07:59.585 22:15:09 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:07:59.585 22:15:09 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:59.585 22:15:09 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:59.585 22:15:09 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:07:59.585 22:15:09 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:07:59.585 22:15:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:07:59.844 22:15:09 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:07:59.844 22:15:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:08:00.102 22:15:10 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:08:00.102 22:15:10 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:08:00.361 22:15:10 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:08:00.361 22:15:10 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:08:00.620 22:15:10 json_config -- json_config/json_config.sh@193 -- # uname -s 00:08:00.620 22:15:10 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:08:00.620 22:15:10 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:08:00.620 22:15:10 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:08:00.620 22:15:10 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:08:00.620 22:15:10 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:00.620 22:15:10 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:00.620 22:15:10 json_config -- json_config/json_config.sh@323 -- # killprocess 3386578 00:08:00.620 22:15:10 json_config -- common/autotest_common.sh@948 -- # '[' -z 3386578 ']' 00:08:00.620 22:15:10 json_config -- common/autotest_common.sh@952 -- # kill -0 3386578 00:08:00.620 22:15:10 json_config -- common/autotest_common.sh@953 -- # uname 00:08:00.620 22:15:10 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:00.620 22:15:10 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3386578 00:08:00.620 22:15:10 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:00.620 22:15:10 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:00.620 22:15:10 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3386578' 00:08:00.620 killing process with pid 3386578 00:08:00.620 22:15:10 json_config -- common/autotest_common.sh@967 -- # kill 3386578 00:08:00.620 22:15:10 json_config -- common/autotest_common.sh@972 -- # wait 3386578 00:08:03.909 22:15:14 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:03.909 22:15:14 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:08:03.909 22:15:14 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:03.909 22:15:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:03.909 22:15:14 json_config -- json_config/json_config.sh@328 -- # return 0 00:08:03.909 22:15:14 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:08:03.909 INFO: Success 00:08:03.909 00:08:03.909 real 0m27.625s 00:08:03.909 user 0m33.227s 00:08:03.909 sys 0m3.834s 00:08:03.909 22:15:14 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:03.909 22:15:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:03.909 ************************************ 00:08:03.909 END TEST json_config 00:08:03.909 ************************************ 00:08:03.909 22:15:14 -- common/autotest_common.sh@1142 -- # return 0 00:08:03.909 22:15:14 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:08:03.909 22:15:14 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:03.909 22:15:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:03.909 22:15:14 -- common/autotest_common.sh@10 -- # set +x 00:08:03.909 ************************************ 00:08:03.909 START TEST json_config_extra_key 00:08:03.909 ************************************ 00:08:03.909 22:15:14 
json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:08:04.168 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:08:04.168 22:15:14 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:08:04.169 22:15:14 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:08:04.169 22:15:14 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:08:04.169 22:15:14 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:08:04.169 22:15:14 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:04.169 22:15:14 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:04.169 22:15:14 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:04.169 22:15:14 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:04.169 22:15:14 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:04.169 22:15:14 json_config_extra_key -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:04.169 22:15:14 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:08:04.169 22:15:14 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:04.169 22:15:14 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:08:04.169 22:15:14 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:08:04.169 22:15:14 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:08:04.169 22:15:14 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:08:04.169 22:15:14 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:08:04.169 22:15:14 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:08:04.169 22:15:14 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:08:04.169 22:15:14 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:08:04.169 22:15:14 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:08:04.169 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:08:04.169 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:08:04.169 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:08:04.169 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:08:04.169 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:08:04.169 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:08:04.169 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:08:04.169 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:08:04.169 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:08:04.169 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:08:04.169 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:08:04.169 INFO: launching applications... 
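The associative arrays declared above are the only per-application state json_config/common.sh keeps. A launch assembled from them looks roughly like the sketch below; how the helper actually expands the arrays into a command line is an assumption, and $rootdir stands for the spdk checkout as elsewhere in this log.

  declare -A app_pid=([target]='')
  declare -A app_socket=([target]='/var/tmp/spdk_tgt.sock')
  declare -A app_params=([target]='-m 0x1 -s 1024')
  declare -A configs_path=([target]="$rootdir/test/json_config/extra_key.json")

  app=target
  # Start the target with its per-app parameters and remember the PID.
  "$rootdir/build/bin/spdk_tgt" ${app_params[$app]} -r "${app_socket[$app]}" \
      --json "${configs_path[$app]}" &
  app_pid[$app]=$!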
00:08:04.169 22:15:14 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:08:04.169 22:15:14 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:08:04.169 22:15:14 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:08:04.169 22:15:14 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:08:04.169 22:15:14 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:08:04.169 22:15:14 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:08:04.169 22:15:14 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:04.169 22:15:14 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:04.169 22:15:14 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=3388147 00:08:04.169 22:15:14 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:08:04.169 Waiting for target to run... 00:08:04.169 22:15:14 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 3388147 /var/tmp/spdk_tgt.sock 00:08:04.169 22:15:14 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 3388147 ']' 00:08:04.169 22:15:14 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:08:04.169 22:15:14 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:04.169 22:15:14 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:08:04.169 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:08:04.169 22:15:14 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:04.169 22:15:14 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:08:04.169 22:15:14 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:08:04.169 [2024-07-12 22:15:14.341161] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:04.169 [2024-07-12 22:15:14.341235] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3388147 ] 00:08:04.428 [2024-07-12 22:15:14.715749] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.687 [2024-07-12 22:15:14.807536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.946 22:15:15 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:04.946 22:15:15 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:08:04.946 22:15:15 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:08:04.946 00:08:04.946 22:15:15 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:08:04.946 INFO: shutting down applications... 
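waitforlisten, logged above with max_retries=100, amounts to polling the freshly started target until it answers on its UNIX-domain RPC socket. A minimal sketch, not the helper's exact body; using rpc_get_methods as the probe and the 0.5 s pause are assumptions.

  # Retry a cheap RPC until the target is listening on /var/tmp/spdk_tgt.sock.
  for ((i = 0; i < 100; i++)); do
      "$rootdir/scripts/rpc.py" -s /var/tmp/spdk_tgt.sock rpc_get_methods >/dev/null 2>&1 && break
      sleep 0.5
  done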
00:08:04.946 22:15:15 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:08:04.946 22:15:15 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:08:04.946 22:15:15 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:08:04.946 22:15:15 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 3388147 ]] 00:08:04.946 22:15:15 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 3388147 00:08:04.946 22:15:15 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:08:04.946 22:15:15 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:04.946 22:15:15 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 3388147 00:08:04.946 22:15:15 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:08:05.515 22:15:15 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:08:05.515 22:15:15 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:05.515 22:15:15 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 3388147 00:08:05.515 22:15:15 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:08:05.515 22:15:15 json_config_extra_key -- json_config/common.sh@43 -- # break 00:08:05.515 22:15:15 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:08:05.515 22:15:15 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:08:05.515 SPDK target shutdown done 00:08:05.515 22:15:15 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:08:05.515 Success 00:08:05.515 00:08:05.515 real 0m1.572s 00:08:05.515 user 0m1.258s 00:08:05.515 sys 0m0.495s 00:08:05.515 22:15:15 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:05.515 22:15:15 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:08:05.515 ************************************ 00:08:05.515 END TEST json_config_extra_key 00:08:05.515 ************************************ 00:08:05.515 22:15:15 -- common/autotest_common.sh@1142 -- # return 0 00:08:05.515 22:15:15 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:08:05.515 22:15:15 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:05.516 22:15:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:05.516 22:15:15 -- common/autotest_common.sh@10 -- # set +x 00:08:05.516 ************************************ 00:08:05.516 START TEST alias_rpc 00:08:05.516 ************************************ 00:08:05.516 22:15:15 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:08:05.775 * Looking for test storage... 
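The killprocess helper that tears these targets down (its xtrace appears both in the json_config teardown above and in the alias_rpc test below) can be reconstructed roughly as follows; the sudo special case is only noted, since the trace never shows that branch being taken.

  # Kill an SPDK app by PID: confirm it is alive, refuse to act on a sudo
  # wrapper, then send the default SIGTERM and wait for it to be reaped.
  killprocess() {
      local pid=$1
      kill -0 "$pid"
      local name
      name=$(ps --no-headers -o comm= "$pid")   # reactor_0 in the runs above
      [ "$name" = sudo ] && return               # the real helper handles sudo specially
      echo "killing process with pid $pid"
      kill "$pid"
      wait "$pid"
  }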
00:08:05.775 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:08:05.775 22:15:15 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:05.775 22:15:15 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=3388371 00:08:05.775 22:15:15 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 3388371 00:08:05.775 22:15:15 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 3388371 ']' 00:08:05.775 22:15:15 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:05.775 22:15:15 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:05.775 22:15:15 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:05.775 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:05.775 22:15:15 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:05.775 22:15:15 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:05.775 22:15:15 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:05.775 [2024-07-12 22:15:16.011412] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:05.775 [2024-07-12 22:15:16.011487] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3388371 ] 00:08:06.035 [2024-07-12 22:15:16.139803] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.035 [2024-07-12 22:15:16.243216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.604 22:15:16 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:06.604 22:15:16 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:06.604 22:15:16 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:08:06.863 22:15:17 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 3388371 00:08:06.863 22:15:17 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 3388371 ']' 00:08:06.863 22:15:17 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 3388371 00:08:06.863 22:15:17 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:08:06.863 22:15:17 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:06.863 22:15:17 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3388371 00:08:06.863 22:15:17 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:06.863 22:15:17 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:06.863 22:15:17 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3388371' 00:08:06.863 killing process with pid 3388371 00:08:06.863 22:15:17 alias_rpc -- common/autotest_common.sh@967 -- # kill 3388371 00:08:06.863 22:15:17 alias_rpc -- common/autotest_common.sh@972 -- # wait 3388371 00:08:07.430 00:08:07.430 real 0m1.649s 00:08:07.430 user 0m1.670s 00:08:07.430 sys 0m0.564s 00:08:07.430 22:15:17 alias_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:07.430 22:15:17 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:07.430 ************************************ 00:08:07.430 END TEST alias_rpc 
00:08:07.430 ************************************ 00:08:07.430 22:15:17 -- common/autotest_common.sh@1142 -- # return 0 00:08:07.430 22:15:17 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:08:07.430 22:15:17 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:08:07.430 22:15:17 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:07.430 22:15:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:07.430 22:15:17 -- common/autotest_common.sh@10 -- # set +x 00:08:07.430 ************************************ 00:08:07.430 START TEST spdkcli_tcp 00:08:07.430 ************************************ 00:08:07.430 22:15:17 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:08:07.430 * Looking for test storage... 00:08:07.430 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:08:07.430 22:15:17 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:08:07.430 22:15:17 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:08:07.430 22:15:17 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:08:07.430 22:15:17 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:08:07.430 22:15:17 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:08:07.430 22:15:17 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:08:07.430 22:15:17 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:08:07.430 22:15:17 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:07.430 22:15:17 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:07.430 22:15:17 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=3388607 00:08:07.430 22:15:17 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 3388607 00:08:07.430 22:15:17 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:08:07.430 22:15:17 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 3388607 ']' 00:08:07.430 22:15:17 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:07.430 22:15:17 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:07.430 22:15:17 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:07.430 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:07.430 22:15:17 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:07.430 22:15:17 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:07.430 [2024-07-12 22:15:17.740194] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
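The spdkcli_tcp setup that follows bridges the target's default UNIX-domain RPC socket to TCP and then exercises the network path; stripped of the trace prefixes it is essentially the sketch below, run from the spdk checkout, with the retry and timeout flags copied from the trace.

  # Start the target on two cores, then expose /var/tmp/spdk.sock on 127.0.0.1:9998.
  ./build/bin/spdk_tgt -m 0x3 -p 0 &
  socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
  socat_pid=$!
  # Talk to the target over TCP; -r/-t retry until the bridge is ready.
  ./scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods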
00:08:07.430 [2024-07-12 22:15:17.740265] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3388607 ] 00:08:07.688 [2024-07-12 22:15:17.869264] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:07.688 [2024-07-12 22:15:17.974283] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:07.688 [2024-07-12 22:15:17.974288] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.624 22:15:18 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:08.624 22:15:18 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:08:08.624 22:15:18 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=3388786 00:08:08.624 22:15:18 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:08:08.624 22:15:18 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:08:08.624 [ 00:08:08.625 "bdev_malloc_delete", 00:08:08.625 "bdev_malloc_create", 00:08:08.625 "bdev_null_resize", 00:08:08.625 "bdev_null_delete", 00:08:08.625 "bdev_null_create", 00:08:08.625 "bdev_nvme_cuse_unregister", 00:08:08.625 "bdev_nvme_cuse_register", 00:08:08.625 "bdev_opal_new_user", 00:08:08.625 "bdev_opal_set_lock_state", 00:08:08.625 "bdev_opal_delete", 00:08:08.625 "bdev_opal_get_info", 00:08:08.625 "bdev_opal_create", 00:08:08.625 "bdev_nvme_opal_revert", 00:08:08.625 "bdev_nvme_opal_init", 00:08:08.625 "bdev_nvme_send_cmd", 00:08:08.625 "bdev_nvme_get_path_iostat", 00:08:08.625 "bdev_nvme_get_mdns_discovery_info", 00:08:08.625 "bdev_nvme_stop_mdns_discovery", 00:08:08.625 "bdev_nvme_start_mdns_discovery", 00:08:08.625 "bdev_nvme_set_multipath_policy", 00:08:08.625 "bdev_nvme_set_preferred_path", 00:08:08.625 "bdev_nvme_get_io_paths", 00:08:08.625 "bdev_nvme_remove_error_injection", 00:08:08.625 "bdev_nvme_add_error_injection", 00:08:08.625 "bdev_nvme_get_discovery_info", 00:08:08.625 "bdev_nvme_stop_discovery", 00:08:08.625 "bdev_nvme_start_discovery", 00:08:08.625 "bdev_nvme_get_controller_health_info", 00:08:08.625 "bdev_nvme_disable_controller", 00:08:08.625 "bdev_nvme_enable_controller", 00:08:08.625 "bdev_nvme_reset_controller", 00:08:08.625 "bdev_nvme_get_transport_statistics", 00:08:08.625 "bdev_nvme_apply_firmware", 00:08:08.625 "bdev_nvme_detach_controller", 00:08:08.625 "bdev_nvme_get_controllers", 00:08:08.625 "bdev_nvme_attach_controller", 00:08:08.625 "bdev_nvme_set_hotplug", 00:08:08.625 "bdev_nvme_set_options", 00:08:08.625 "bdev_passthru_delete", 00:08:08.625 "bdev_passthru_create", 00:08:08.625 "bdev_lvol_set_parent_bdev", 00:08:08.625 "bdev_lvol_set_parent", 00:08:08.625 "bdev_lvol_check_shallow_copy", 00:08:08.625 "bdev_lvol_start_shallow_copy", 00:08:08.625 "bdev_lvol_grow_lvstore", 00:08:08.625 "bdev_lvol_get_lvols", 00:08:08.625 "bdev_lvol_get_lvstores", 00:08:08.625 "bdev_lvol_delete", 00:08:08.625 "bdev_lvol_set_read_only", 00:08:08.625 "bdev_lvol_resize", 00:08:08.625 "bdev_lvol_decouple_parent", 00:08:08.625 "bdev_lvol_inflate", 00:08:08.625 "bdev_lvol_rename", 00:08:08.625 "bdev_lvol_clone_bdev", 00:08:08.625 "bdev_lvol_clone", 00:08:08.625 "bdev_lvol_snapshot", 00:08:08.625 "bdev_lvol_create", 00:08:08.625 "bdev_lvol_delete_lvstore", 00:08:08.625 "bdev_lvol_rename_lvstore", 00:08:08.625 "bdev_lvol_create_lvstore", 
00:08:08.625 "bdev_raid_set_options", 00:08:08.625 "bdev_raid_remove_base_bdev", 00:08:08.625 "bdev_raid_add_base_bdev", 00:08:08.625 "bdev_raid_delete", 00:08:08.625 "bdev_raid_create", 00:08:08.625 "bdev_raid_get_bdevs", 00:08:08.625 "bdev_error_inject_error", 00:08:08.625 "bdev_error_delete", 00:08:08.625 "bdev_error_create", 00:08:08.625 "bdev_split_delete", 00:08:08.625 "bdev_split_create", 00:08:08.625 "bdev_delay_delete", 00:08:08.625 "bdev_delay_create", 00:08:08.625 "bdev_delay_update_latency", 00:08:08.625 "bdev_zone_block_delete", 00:08:08.625 "bdev_zone_block_create", 00:08:08.625 "blobfs_create", 00:08:08.625 "blobfs_detect", 00:08:08.625 "blobfs_set_cache_size", 00:08:08.625 "bdev_crypto_delete", 00:08:08.625 "bdev_crypto_create", 00:08:08.625 "bdev_compress_delete", 00:08:08.625 "bdev_compress_create", 00:08:08.625 "bdev_compress_get_orphans", 00:08:08.625 "bdev_aio_delete", 00:08:08.625 "bdev_aio_rescan", 00:08:08.625 "bdev_aio_create", 00:08:08.625 "bdev_ftl_set_property", 00:08:08.625 "bdev_ftl_get_properties", 00:08:08.625 "bdev_ftl_get_stats", 00:08:08.625 "bdev_ftl_unmap", 00:08:08.625 "bdev_ftl_unload", 00:08:08.625 "bdev_ftl_delete", 00:08:08.625 "bdev_ftl_load", 00:08:08.625 "bdev_ftl_create", 00:08:08.625 "bdev_virtio_attach_controller", 00:08:08.625 "bdev_virtio_scsi_get_devices", 00:08:08.625 "bdev_virtio_detach_controller", 00:08:08.625 "bdev_virtio_blk_set_hotplug", 00:08:08.625 "bdev_iscsi_delete", 00:08:08.625 "bdev_iscsi_create", 00:08:08.625 "bdev_iscsi_set_options", 00:08:08.625 "accel_error_inject_error", 00:08:08.625 "ioat_scan_accel_module", 00:08:08.625 "dsa_scan_accel_module", 00:08:08.625 "iaa_scan_accel_module", 00:08:08.625 "dpdk_cryptodev_get_driver", 00:08:08.625 "dpdk_cryptodev_set_driver", 00:08:08.625 "dpdk_cryptodev_scan_accel_module", 00:08:08.625 "compressdev_scan_accel_module", 00:08:08.625 "keyring_file_remove_key", 00:08:08.625 "keyring_file_add_key", 00:08:08.625 "keyring_linux_set_options", 00:08:08.625 "iscsi_get_histogram", 00:08:08.625 "iscsi_enable_histogram", 00:08:08.625 "iscsi_set_options", 00:08:08.625 "iscsi_get_auth_groups", 00:08:08.625 "iscsi_auth_group_remove_secret", 00:08:08.625 "iscsi_auth_group_add_secret", 00:08:08.625 "iscsi_delete_auth_group", 00:08:08.625 "iscsi_create_auth_group", 00:08:08.625 "iscsi_set_discovery_auth", 00:08:08.625 "iscsi_get_options", 00:08:08.625 "iscsi_target_node_request_logout", 00:08:08.625 "iscsi_target_node_set_redirect", 00:08:08.625 "iscsi_target_node_set_auth", 00:08:08.625 "iscsi_target_node_add_lun", 00:08:08.625 "iscsi_get_stats", 00:08:08.625 "iscsi_get_connections", 00:08:08.625 "iscsi_portal_group_set_auth", 00:08:08.625 "iscsi_start_portal_group", 00:08:08.625 "iscsi_delete_portal_group", 00:08:08.625 "iscsi_create_portal_group", 00:08:08.625 "iscsi_get_portal_groups", 00:08:08.625 "iscsi_delete_target_node", 00:08:08.625 "iscsi_target_node_remove_pg_ig_maps", 00:08:08.625 "iscsi_target_node_add_pg_ig_maps", 00:08:08.625 "iscsi_create_target_node", 00:08:08.625 "iscsi_get_target_nodes", 00:08:08.625 "iscsi_delete_initiator_group", 00:08:08.625 "iscsi_initiator_group_remove_initiators", 00:08:08.625 "iscsi_initiator_group_add_initiators", 00:08:08.625 "iscsi_create_initiator_group", 00:08:08.625 "iscsi_get_initiator_groups", 00:08:08.625 "nvmf_set_crdt", 00:08:08.625 "nvmf_set_config", 00:08:08.625 "nvmf_set_max_subsystems", 00:08:08.625 "nvmf_stop_mdns_prr", 00:08:08.625 "nvmf_publish_mdns_prr", 00:08:08.625 "nvmf_subsystem_get_listeners", 00:08:08.625 
"nvmf_subsystem_get_qpairs", 00:08:08.625 "nvmf_subsystem_get_controllers", 00:08:08.625 "nvmf_get_stats", 00:08:08.625 "nvmf_get_transports", 00:08:08.625 "nvmf_create_transport", 00:08:08.625 "nvmf_get_targets", 00:08:08.625 "nvmf_delete_target", 00:08:08.625 "nvmf_create_target", 00:08:08.625 "nvmf_subsystem_allow_any_host", 00:08:08.625 "nvmf_subsystem_remove_host", 00:08:08.625 "nvmf_subsystem_add_host", 00:08:08.625 "nvmf_ns_remove_host", 00:08:08.625 "nvmf_ns_add_host", 00:08:08.625 "nvmf_subsystem_remove_ns", 00:08:08.625 "nvmf_subsystem_add_ns", 00:08:08.625 "nvmf_subsystem_listener_set_ana_state", 00:08:08.625 "nvmf_discovery_get_referrals", 00:08:08.625 "nvmf_discovery_remove_referral", 00:08:08.625 "nvmf_discovery_add_referral", 00:08:08.625 "nvmf_subsystem_remove_listener", 00:08:08.625 "nvmf_subsystem_add_listener", 00:08:08.625 "nvmf_delete_subsystem", 00:08:08.625 "nvmf_create_subsystem", 00:08:08.625 "nvmf_get_subsystems", 00:08:08.625 "env_dpdk_get_mem_stats", 00:08:08.625 "nbd_get_disks", 00:08:08.625 "nbd_stop_disk", 00:08:08.625 "nbd_start_disk", 00:08:08.625 "ublk_recover_disk", 00:08:08.625 "ublk_get_disks", 00:08:08.625 "ublk_stop_disk", 00:08:08.625 "ublk_start_disk", 00:08:08.625 "ublk_destroy_target", 00:08:08.625 "ublk_create_target", 00:08:08.625 "virtio_blk_create_transport", 00:08:08.625 "virtio_blk_get_transports", 00:08:08.625 "vhost_controller_set_coalescing", 00:08:08.625 "vhost_get_controllers", 00:08:08.625 "vhost_delete_controller", 00:08:08.625 "vhost_create_blk_controller", 00:08:08.625 "vhost_scsi_controller_remove_target", 00:08:08.625 "vhost_scsi_controller_add_target", 00:08:08.625 "vhost_start_scsi_controller", 00:08:08.625 "vhost_create_scsi_controller", 00:08:08.625 "thread_set_cpumask", 00:08:08.625 "framework_get_governor", 00:08:08.625 "framework_get_scheduler", 00:08:08.625 "framework_set_scheduler", 00:08:08.625 "framework_get_reactors", 00:08:08.625 "thread_get_io_channels", 00:08:08.625 "thread_get_pollers", 00:08:08.625 "thread_get_stats", 00:08:08.625 "framework_monitor_context_switch", 00:08:08.625 "spdk_kill_instance", 00:08:08.625 "log_enable_timestamps", 00:08:08.625 "log_get_flags", 00:08:08.625 "log_clear_flag", 00:08:08.625 "log_set_flag", 00:08:08.625 "log_get_level", 00:08:08.625 "log_set_level", 00:08:08.625 "log_get_print_level", 00:08:08.625 "log_set_print_level", 00:08:08.625 "framework_enable_cpumask_locks", 00:08:08.625 "framework_disable_cpumask_locks", 00:08:08.625 "framework_wait_init", 00:08:08.625 "framework_start_init", 00:08:08.625 "scsi_get_devices", 00:08:08.625 "bdev_get_histogram", 00:08:08.625 "bdev_enable_histogram", 00:08:08.625 "bdev_set_qos_limit", 00:08:08.625 "bdev_set_qd_sampling_period", 00:08:08.625 "bdev_get_bdevs", 00:08:08.625 "bdev_reset_iostat", 00:08:08.625 "bdev_get_iostat", 00:08:08.625 "bdev_examine", 00:08:08.625 "bdev_wait_for_examine", 00:08:08.625 "bdev_set_options", 00:08:08.625 "notify_get_notifications", 00:08:08.625 "notify_get_types", 00:08:08.625 "accel_get_stats", 00:08:08.625 "accel_set_options", 00:08:08.625 "accel_set_driver", 00:08:08.625 "accel_crypto_key_destroy", 00:08:08.625 "accel_crypto_keys_get", 00:08:08.625 "accel_crypto_key_create", 00:08:08.625 "accel_assign_opc", 00:08:08.625 "accel_get_module_info", 00:08:08.625 "accel_get_opc_assignments", 00:08:08.625 "vmd_rescan", 00:08:08.625 "vmd_remove_device", 00:08:08.625 "vmd_enable", 00:08:08.625 "sock_get_default_impl", 00:08:08.625 "sock_set_default_impl", 00:08:08.625 "sock_impl_set_options", 00:08:08.625 
"sock_impl_get_options", 00:08:08.625 "iobuf_get_stats", 00:08:08.625 "iobuf_set_options", 00:08:08.625 "framework_get_pci_devices", 00:08:08.625 "framework_get_config", 00:08:08.626 "framework_get_subsystems", 00:08:08.626 "trace_get_info", 00:08:08.626 "trace_get_tpoint_group_mask", 00:08:08.626 "trace_disable_tpoint_group", 00:08:08.626 "trace_enable_tpoint_group", 00:08:08.626 "trace_clear_tpoint_mask", 00:08:08.626 "trace_set_tpoint_mask", 00:08:08.626 "keyring_get_keys", 00:08:08.626 "spdk_get_version", 00:08:08.626 "rpc_get_methods" 00:08:08.626 ] 00:08:08.626 22:15:18 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:08:08.626 22:15:18 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:08.626 22:15:18 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:08.884 22:15:18 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:08:08.884 22:15:18 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 3388607 00:08:08.884 22:15:18 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 3388607 ']' 00:08:08.884 22:15:18 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 3388607 00:08:08.884 22:15:18 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:08:08.884 22:15:18 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:08.884 22:15:18 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3388607 00:08:08.884 22:15:19 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:08.884 22:15:19 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:08.884 22:15:19 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3388607' 00:08:08.884 killing process with pid 3388607 00:08:08.884 22:15:19 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 3388607 00:08:08.884 22:15:19 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 3388607 00:08:09.143 00:08:09.143 real 0m1.858s 00:08:09.143 user 0m3.392s 00:08:09.143 sys 0m0.619s 00:08:09.143 22:15:19 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:09.143 22:15:19 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:09.143 ************************************ 00:08:09.143 END TEST spdkcli_tcp 00:08:09.143 ************************************ 00:08:09.143 22:15:19 -- common/autotest_common.sh@1142 -- # return 0 00:08:09.402 22:15:19 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:08:09.402 22:15:19 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:09.402 22:15:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:09.402 22:15:19 -- common/autotest_common.sh@10 -- # set +x 00:08:09.402 ************************************ 00:08:09.402 START TEST dpdk_mem_utility 00:08:09.402 ************************************ 00:08:09.402 22:15:19 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:08:09.402 * Looking for test storage... 
00:08:09.403 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:08:09.403 22:15:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:08:09.403 22:15:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=3389019 00:08:09.403 22:15:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 3389019 00:08:09.403 22:15:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:09.403 22:15:19 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 3389019 ']' 00:08:09.403 22:15:19 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:09.403 22:15:19 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:09.403 22:15:19 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:09.403 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:09.403 22:15:19 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:09.403 22:15:19 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:08:09.403 [2024-07-12 22:15:19.655051] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:09.403 [2024-07-12 22:15:19.655111] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3389019 ] 00:08:09.662 [2024-07-12 22:15:19.764729] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.662 [2024-07-12 22:15:19.862764] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.229 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:10.229 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:08:10.229 22:15:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:08:10.229 22:15:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:08:10.229 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:10.229 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:08:10.229 { 00:08:10.229 "filename": "/tmp/spdk_mem_dump.txt" 00:08:10.229 } 00:08:10.229 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:10.229 22:15:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:08:10.491 DPDK memory size 816.000000 MiB in 2 heap(s) 00:08:10.491 2 heaps totaling size 816.000000 MiB 00:08:10.491 size: 814.000000 MiB heap id: 0 00:08:10.491 size: 2.000000 MiB heap id: 1 00:08:10.491 end heaps---------- 00:08:10.491 8 mempools totaling size 598.116089 MiB 00:08:10.491 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:08:10.491 size: 158.602051 MiB name: PDU_data_out_Pool 00:08:10.491 size: 84.521057 MiB name: bdev_io_3389019 00:08:10.491 size: 51.011292 MiB name: evtpool_3389019 00:08:10.491 size: 50.003479 MiB name: 
msgpool_3389019 00:08:10.491 size: 21.763794 MiB name: PDU_Pool 00:08:10.491 size: 19.513306 MiB name: SCSI_TASK_Pool 00:08:10.491 size: 0.026123 MiB name: Session_Pool 00:08:10.491 end mempools------- 00:08:10.491 201 memzones totaling size 4.176453 MiB 00:08:10.491 size: 1.000366 MiB name: RG_ring_0_3389019 00:08:10.491 size: 1.000366 MiB name: RG_ring_1_3389019 00:08:10.491 size: 1.000366 MiB name: RG_ring_4_3389019 00:08:10.491 size: 1.000366 MiB name: RG_ring_5_3389019 00:08:10.491 size: 0.125366 MiB name: RG_ring_2_3389019 00:08:10.491 size: 0.015991 MiB name: RG_ring_3_3389019 00:08:10.491 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:08:10.491 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:08:10.491 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:08:10.491 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:08:10.491 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:08:10.491 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:01.7_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:08:10.492 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:01.0_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:01.1_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:01.2_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:01.3_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:01.4_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:01.5_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:01.6_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:01.7_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:02.0_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:02.1_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:02.2_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:02.3_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:02.4_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:02.5_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:02.6_qat 00:08:10.492 size: 0.000305 MiB name: 0000:da:02.7_qat 00:08:10.492 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_0 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_1 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_0 00:08:10.492 size: 0.000122 
MiB name: rte_cryptodev_data_2 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_3 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_1 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_4 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_5 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_2 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_6 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_7 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_3 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_8 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_9 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_4 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_10 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_11 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_5 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_12 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_13 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_6 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_14 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_15 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_7 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_16 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_17 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_8 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_18 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_19 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_9 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_20 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_21 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_10 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_22 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_23 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_11 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_24 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_25 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_12 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_26 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_27 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_13 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_28 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_29 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_14 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_30 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_31 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_15 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_32 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_33 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_16 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_34 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_35 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_17 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_36 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_37 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_18 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_38 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_39 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_19 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_40 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_41 00:08:10.492 size: 
0.000122 MiB name: rte_compressdev_data_20 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_42 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_43 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_21 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_44 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_45 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_22 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_46 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_47 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_23 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_48 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_49 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_24 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_50 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_51 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_25 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_52 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_53 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_26 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_54 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_55 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_27 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_56 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_57 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_28 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_58 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_59 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_29 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_60 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_61 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_30 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_62 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_63 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_31 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_64 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_65 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_32 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_66 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_67 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_33 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_68 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_69 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_34 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_70 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_71 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_35 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_72 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_73 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_36 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_74 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_75 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_37 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_76 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_77 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_38 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_78 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_79 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_39 00:08:10.492 size: 0.000122 MiB name: 
rte_cryptodev_data_80 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_81 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_40 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_82 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_83 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_41 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_84 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_85 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_42 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_86 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_87 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_43 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_88 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_89 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_44 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_90 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_91 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_45 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_92 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_93 00:08:10.492 size: 0.000122 MiB name: rte_compressdev_data_46 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_94 00:08:10.492 size: 0.000122 MiB name: rte_cryptodev_data_95 00:08:10.493 size: 0.000122 MiB name: rte_compressdev_data_47 00:08:10.493 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:08:10.493 end memzones------- 00:08:10.493 22:15:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:08:10.493 heap id: 0 total size: 814.000000 MiB number of busy elements: 533 number of free elements: 14 00:08:10.493 list of free elements. size: 11.812256 MiB 00:08:10.493 element at address: 0x200000400000 with size: 1.999512 MiB 00:08:10.493 element at address: 0x200018e00000 with size: 0.999878 MiB 00:08:10.493 element at address: 0x200019000000 with size: 0.999878 MiB 00:08:10.493 element at address: 0x200003e00000 with size: 0.996460 MiB 00:08:10.493 element at address: 0x200031c00000 with size: 0.994446 MiB 00:08:10.493 element at address: 0x200013800000 with size: 0.978882 MiB 00:08:10.493 element at address: 0x200007000000 with size: 0.959839 MiB 00:08:10.493 element at address: 0x200019200000 with size: 0.937256 MiB 00:08:10.493 element at address: 0x20001aa00000 with size: 0.581421 MiB 00:08:10.493 element at address: 0x200003a00000 with size: 0.498535 MiB 00:08:10.493 element at address: 0x20000b200000 with size: 0.491272 MiB 00:08:10.493 element at address: 0x200000800000 with size: 0.486511 MiB 00:08:10.493 element at address: 0x200019400000 with size: 0.485840 MiB 00:08:10.493 element at address: 0x200027e00000 with size: 0.402527 MiB 00:08:10.493 list of standard malloc elements. 
size: 199.879456 MiB 00:08:10.493 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:08:10.493 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:08:10.493 element at address: 0x200018efff80 with size: 1.000122 MiB 00:08:10.493 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:08:10.493 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:08:10.493 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:08:10.493 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:08:10.493 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:08:10.493 element at address: 0x200000330b40 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000337640 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000033e140 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000344c40 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000034b740 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000352240 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000358d40 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000035f840 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000366880 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000036a340 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000036de00 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003718c0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000375380 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000378e40 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000037c900 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000383e80 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000387940 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000038b400 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000392980 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000396440 with size: 0.004395 MiB 00:08:10.493 element at address: 0x200000399f00 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003bea80 with size: 0.004395 MiB 
00:08:10.493 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003d4b00 with size: 0.004395 MiB 00:08:10.493 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:08:10.493 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000333040 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000335540 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000339b40 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000033c040 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000340640 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000342b40 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000347140 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000349640 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000350140 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000354740 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000356c40 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000035a1c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000035b240 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000035d740 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000361d40 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000364780 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000365800 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000368240 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000370840 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000373280 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000374300 with size: 0.004028 MiB 00:08:10.493 element at 
address: 0x200000376d40 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000037a800 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000037b880 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000037f340 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000381d80 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000382e00 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000385840 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000389300 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000038a380 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000038cdc0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000038de40 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000390880 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000391900 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000394340 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000397e00 with size: 0.004028 MiB 00:08:10.493 element at address: 0x200000398e80 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000039c940 with size: 0.004028 MiB 00:08:10.493 element at address: 0x20000039f380 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003c0440 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:08:10.493 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:08:10.494 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:08:10.494 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:08:10.494 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:08:10.494 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:08:10.494 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:08:10.494 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:08:10.494 element at address: 0x2000003cffc0 
with size: 0.004028 MiB 00:08:10.494 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:08:10.494 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:08:10.494 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:08:10.494 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:08:10.494 element at address: 0x200000204f80 with size: 0.000305 MiB 00:08:10.494 element at address: 0x200000200000 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200180 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200240 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200300 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200480 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200540 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200600 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200780 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200840 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200900 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200a80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200b40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200c00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200d80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200e40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200f00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201080 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201140 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201200 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201380 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201440 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201500 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201680 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201740 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201800 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201980 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201a40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201b00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201c80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201d40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201e00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000201f80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202040 with size: 0.000183 MiB 
00:08:10.494 element at address: 0x200000202100 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202280 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202340 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202400 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202580 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202640 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202700 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202880 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202940 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202a00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202b80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202c40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202d00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202e80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000202f40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203000 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203180 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203240 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203300 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203480 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203540 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203600 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203780 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203840 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203900 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203a80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203b40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203c00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203d80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203e40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203f00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204080 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204140 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204200 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204380 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204440 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204500 with size: 0.000183 MiB 00:08:10.494 element at 
address: 0x2000002045c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204680 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204740 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204800 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204980 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204a40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204b00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204bc0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204c80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204d40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204e00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000204ec0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205180 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205240 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205300 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205480 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205540 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205600 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205780 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205840 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205900 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205a80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205b40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205c00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205d80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205e40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205f00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000206080 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000206140 with size: 0.000183 MiB 00:08:10.494 element at address: 0x200000206200 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000020a780 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022af80 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022b040 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022b100 
with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022b280 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022b340 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022b400 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022b580 with size: 0.000183 MiB 00:08:10.494 element at address: 0x20000022b640 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022b700 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022b7c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022be40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022c080 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022c140 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022c200 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022c380 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022c440 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000022c500 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000032e700 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000331d40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000338840 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000033f340 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000345e40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000034c940 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000034fec0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000353440 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000359f40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000360a40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000364180 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000364240 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000364400 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000367a80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000367c40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000367d00 with size: 0.000183 MiB 
00:08:10.495 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000036b540 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000036b700 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000036b980 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000036f000 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000036f280 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000036f440 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000372c80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000372d40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000372f00 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000376580 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000376740 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000376800 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000037a040 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000037a200 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000037a480 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000037db00 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000037df40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000381780 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000381840 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000381a00 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000385080 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000385240 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000385300 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000388b40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000388d00 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000388f80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000038c600 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000038c880 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000390280 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000390340 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000390500 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000393b80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000393d40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000393e00 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:08:10.495 element at 
address: 0x200000397640 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000397800 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x200000397a80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000039b100 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000039b380 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000039b540 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x20000039f000 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003a9dc0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003c3740 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003c3900 
with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:08:10.495 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20000087c980 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:08:10.496 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:08:10.496 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e670c0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e67180 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6dd80 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6e100 with size: 0.000183 MiB 
00:08:10.496 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:08:10.496 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:08:10.496 list of memzone associated elements. 
size: 602.308289 MiB 00:08:10.496 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:08:10.496 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:08:10.496 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:08:10.496 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:08:10.496 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:08:10.496 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_3389019_0 00:08:10.496 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:08:10.496 associated memzone info: size: 48.002930 MiB name: MP_evtpool_3389019_0 00:08:10.496 element at address: 0x200003fff380 with size: 48.003052 MiB 00:08:10.496 associated memzone info: size: 48.002930 MiB name: MP_msgpool_3389019_0 00:08:10.496 element at address: 0x2000195be940 with size: 20.255554 MiB 00:08:10.496 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:08:10.496 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:08:10.496 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:08:10.496 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:08:10.496 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_3389019 00:08:10.496 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:08:10.496 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_3389019 00:08:10.496 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:08:10.496 associated memzone info: size: 1.007996 MiB name: MP_evtpool_3389019 00:08:10.496 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:08:10.496 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:08:10.496 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:08:10.496 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:08:10.496 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:08:10.496 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:08:10.496 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:08:10.496 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:08:10.496 element at address: 0x200003eff180 with size: 1.000488 MiB 00:08:10.496 associated memzone info: size: 1.000366 MiB name: RG_ring_0_3389019 00:08:10.496 element at address: 0x200003affc00 with size: 1.000488 MiB 00:08:10.496 associated memzone info: size: 1.000366 MiB name: RG_ring_1_3389019 00:08:10.496 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:08:10.496 associated memzone info: size: 1.000366 MiB name: RG_ring_4_3389019 00:08:10.496 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:08:10.496 associated memzone info: size: 1.000366 MiB name: RG_ring_5_3389019 00:08:10.496 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:08:10.496 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_3389019 00:08:10.496 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:08:10.496 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:08:10.496 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:08:10.496 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:08:10.496 element at address: 0x20001947c600 with size: 0.250488 MiB 00:08:10.496 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:08:10.496 element at address: 0x20000020a840 with size: 0.125488 MiB 00:08:10.496 associated 
memzone info: size: 0.125366 MiB name: RG_ring_2_3389019 00:08:10.496 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:08:10.496 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:08:10.496 element at address: 0x200027e67240 with size: 0.023743 MiB 00:08:10.496 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:08:10.496 element at address: 0x200000206580 with size: 0.016113 MiB 00:08:10.496 associated memzone info: size: 0.015991 MiB name: RG_ring_3_3389019 00:08:10.496 element at address: 0x200027e6d380 with size: 0.002441 MiB 00:08:10.496 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:08:10.496 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:08:10.496 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:08:10.496 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:08:10.496 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:08:10.497 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:08:10.497 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:08:10.497 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:08:10.497 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:08:10.497 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:08:10.497 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:08:10.497 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:08:10.497 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat 00:08:10.497 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:08:10.497 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:08:10.497 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:08:10.497 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:08:10.497 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:08:10.497 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:08:10.497 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:08:10.497 element at address: 0x20000039b700 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:08:10.497 element at address: 0x200000397c40 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 
0000:3f:01.1_qat 00:08:10.497 element at address: 0x200000394180 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat 00:08:10.497 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat 00:08:10.497 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat 00:08:10.497 element at address: 0x200000389140 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat 00:08:10.497 element at address: 0x200000385680 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat 00:08:10.497 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.7_qat 00:08:10.497 element at address: 0x20000037e100 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat 00:08:10.497 element at address: 0x20000037a640 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat 00:08:10.497 element at address: 0x200000376b80 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat 00:08:10.497 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat 00:08:10.497 element at address: 0x20000036f600 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat 00:08:10.497 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat 00:08:10.497 element at address: 0x200000368080 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat 00:08:10.497 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat 00:08:10.497 element at address: 0x200000360b00 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat 00:08:10.497 element at address: 0x20000035d580 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat 00:08:10.497 element at address: 0x20000035a000 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat 00:08:10.497 element at address: 0x200000356a80 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat 00:08:10.497 element at address: 0x200000353500 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:01.4_qat 00:08:10.497 element at address: 0x20000034ff80 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat 00:08:10.497 element at address: 0x20000034ca00 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat 00:08:10.497 element at address: 0x200000349480 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat 00:08:10.497 element at address: 0x200000345f00 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat 00:08:10.497 element at address: 
0x200000342980 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat 00:08:10.497 element at address: 0x20000033f400 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat 00:08:10.497 element at address: 0x20000033be80 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat 00:08:10.497 element at address: 0x200000338900 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat 00:08:10.497 element at address: 0x200000335380 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat 00:08:10.497 element at address: 0x200000331e00 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat 00:08:10.497 element at address: 0x20000032e880 with size: 0.000427 MiB 00:08:10.497 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat 00:08:10.497 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:08:10.497 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:08:10.497 element at address: 0x20000022b880 with size: 0.000305 MiB 00:08:10.497 associated memzone info: size: 0.000183 MiB name: MP_msgpool_3389019 00:08:10.497 element at address: 0x200000206380 with size: 0.000305 MiB 00:08:10.497 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_3389019 00:08:10.497 element at address: 0x200027e6de40 with size: 0.000305 MiB 00:08:10.497 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:08:10.497 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:08:10.497 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:08:10.497 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:08:10.497 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:08:10.497 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:08:10.497 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:08:10.497 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:08:10.497 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:08:10.497 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:08:10.497 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:08:10.497 element at address: 0x2000003cb000 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:08:10.497 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:08:10.497 element at 
address: 0x2000003c7700 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:08:10.497 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:08:10.497 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:08:10.497 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:08:10.497 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:08:10.497 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:08:10.497 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:08:10.497 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:08:10.497 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:08:10.497 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:08:10.497 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:08:10.497 element at address: 0x2000003bc280 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:08:10.497 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:08:10.497 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:08:10.497 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:08:10.498 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:08:10.498 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:08:10.498 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:08:10.498 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:08:10.498 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:08:10.498 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:08:10.498 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:08:10.498 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:08:10.498 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_23 00:08:10.498 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:08:10.498 element at address: 0x2000003aa100 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:08:10.498 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:08:10.498 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:08:10.498 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:08:10.498 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:08:10.498 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:08:10.498 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:08:10.498 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:08:10.498 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:08:10.498 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:08:10.498 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:08:10.498 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:08:10.498 element at address: 0x20000039b600 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:08:10.498 element at address: 0x20000039b440 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:08:10.498 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:08:10.498 element at address: 0x200000397b40 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:08:10.498 element at address: 0x200000397980 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:08:10.498 element at address: 0x200000397700 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:08:10.498 element at address: 0x200000394080 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:08:10.498 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:08:10.498 element at address: 0x200000393c40 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:08:10.498 element at address: 
0x2000003905c0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:08:10.498 element at address: 0x200000390400 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:08:10.498 element at address: 0x200000390180 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:08:10.498 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:08:10.498 element at address: 0x20000038c940 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:08:10.498 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:08:10.498 element at address: 0x200000389040 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:08:10.498 element at address: 0x200000388e80 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:08:10.498 element at address: 0x200000388c00 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:08:10.498 element at address: 0x200000385580 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:08:10.498 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:08:10.498 element at address: 0x200000385140 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:08:10.498 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:08:10.498 element at address: 0x200000381900 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:08:10.498 element at address: 0x200000381680 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:08:10.498 element at address: 0x20000037e000 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:08:10.498 element at address: 0x20000037de40 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:08:10.498 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:08:10.498 element at address: 0x20000037a540 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:08:10.498 element at address: 0x20000037a380 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:08:10.498 element at address: 0x20000037a100 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:08:10.498 element at address: 0x200000376a80 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:08:10.498 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_53 00:08:10.498 element at address: 0x200000376640 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:08:10.498 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:08:10.498 element at address: 0x200000372e00 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:08:10.498 element at address: 0x200000372b80 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:08:10.498 element at address: 0x20000036f500 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:08:10.498 element at address: 0x20000036f340 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:08:10.498 element at address: 0x20000036f0c0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:08:10.498 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:08:10.498 element at address: 0x20000036b880 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:08:10.498 element at address: 0x20000036b600 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:08:10.498 element at address: 0x200000367f80 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:08:10.498 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:08:10.498 element at address: 0x200000367b40 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:08:10.498 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:08:10.498 element at address: 0x200000364300 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:08:10.498 element at address: 0x200000364080 with size: 0.000244 MiB 00:08:10.498 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:08:10.498 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:08:10.498 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:08:10.498 22:15:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:08:10.498 22:15:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 3389019 00:08:10.498 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 3389019 ']' 00:08:10.498 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 3389019 00:08:10.498 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:08:10.498 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:10.498 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3389019 00:08:10.498 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:10.499 22:15:20 
dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:10.499 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3389019' 00:08:10.499 killing process with pid 3389019 00:08:10.499 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 3389019 00:08:10.499 22:15:20 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 3389019 00:08:11.068 00:08:11.068 real 0m1.632s 00:08:11.068 user 0m1.740s 00:08:11.068 sys 0m0.501s 00:08:11.068 22:15:21 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:11.068 22:15:21 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:08:11.068 ************************************ 00:08:11.068 END TEST dpdk_mem_utility 00:08:11.068 ************************************ 00:08:11.068 22:15:21 -- common/autotest_common.sh@1142 -- # return 0 00:08:11.068 22:15:21 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:08:11.068 22:15:21 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:11.068 22:15:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:11.068 22:15:21 -- common/autotest_common.sh@10 -- # set +x 00:08:11.068 ************************************ 00:08:11.068 START TEST event 00:08:11.068 ************************************ 00:08:11.068 22:15:21 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:08:11.068 * Looking for test storage... 00:08:11.068 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:08:11.068 22:15:21 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:11.068 22:15:21 event -- bdev/nbd_common.sh@6 -- # set -e 00:08:11.068 22:15:21 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:08:11.068 22:15:21 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:11.068 22:15:21 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:11.068 22:15:21 event -- common/autotest_common.sh@10 -- # set +x 00:08:11.068 ************************************ 00:08:11.068 START TEST event_perf 00:08:11.068 ************************************ 00:08:11.068 22:15:21 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:08:11.327 Running I/O for 1 seconds...[2024-07-12 22:15:21.404136] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
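The memory map dumped above by the dpdk_mem_utility test enumerates every DPDK malloc element and memzone owned by the SPDK target, including the per-device QAT crypto/compress data regions. A dump of the same shape can be requested from any running SPDK target; the sketch below is a condensation under stated assumptions (the binary path, the env_dpdk_get_mem_stats RPC and its default output file are assumptions about this SPDK tree, not lines taken from the log):

#!/usr/bin/env bash
# Rough sketch: ask a running SPDK target for its DPDK memory statistics.
# Paths and the env_dpdk_get_mem_stats RPC are assumptions about this SPDK tree, not commands from the log.
set -e
cd /var/jenkins/workspace/crypto-phy-autotest/spdk
./build/bin/spdk_tgt &                    # start a target in the background
tgt=$!
sleep 2                                   # the real harness polls the RPC socket (waitforlisten) instead of sleeping
./scripts/rpc.py env_dpdk_get_mem_stats   # writes the element/memzone listing to /tmp/spdk_mem_dump.txt
cat /tmp/spdk_mem_dump.txt                # same "element at address ... with size ..." format as above
kill $tgt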
00:08:11.327 [2024-07-12 22:15:21.404201] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3389260 ] 00:08:11.327 [2024-07-12 22:15:21.534371] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:11.327 [2024-07-12 22:15:21.636227] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:11.327 [2024-07-12 22:15:21.636310] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:11.327 [2024-07-12 22:15:21.636387] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:11.327 [2024-07-12 22:15:21.636396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.747 Running I/O for 1 seconds... 00:08:12.747 lcore 0: 179400 00:08:12.747 lcore 1: 179398 00:08:12.747 lcore 2: 179397 00:08:12.747 lcore 3: 179399 00:08:12.747 done. 00:08:12.747 00:08:12.747 real 0m1.353s 00:08:12.747 user 0m4.212s 00:08:12.747 sys 0m0.135s 00:08:12.747 22:15:22 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:12.747 22:15:22 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:08:12.747 ************************************ 00:08:12.747 END TEST event_perf 00:08:12.747 ************************************ 00:08:12.747 22:15:22 event -- common/autotest_common.sh@1142 -- # return 0 00:08:12.747 22:15:22 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:08:12.747 22:15:22 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:12.747 22:15:22 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:12.747 22:15:22 event -- common/autotest_common.sh@10 -- # set +x 00:08:12.747 ************************************ 00:08:12.747 START TEST event_reactor 00:08:12.747 ************************************ 00:08:12.747 22:15:22 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:08:12.747 [2024-07-12 22:15:22.840430] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:08:12.747 [2024-07-12 22:15:22.840495] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3389459 ] 00:08:12.747 [2024-07-12 22:15:22.971964] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.005 [2024-07-12 22:15:23.071605] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.940 test_start 00:08:13.940 oneshot 00:08:13.940 tick 100 00:08:13.940 tick 100 00:08:13.940 tick 250 00:08:13.940 tick 100 00:08:13.940 tick 100 00:08:13.940 tick 100 00:08:13.940 tick 250 00:08:13.940 tick 500 00:08:13.940 tick 100 00:08:13.940 tick 100 00:08:13.940 tick 250 00:08:13.940 tick 100 00:08:13.940 tick 100 00:08:13.940 test_end 00:08:13.940 00:08:13.940 real 0m1.351s 00:08:13.940 user 0m1.207s 00:08:13.940 sys 0m0.137s 00:08:13.940 22:15:24 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:13.940 22:15:24 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:08:13.940 ************************************ 00:08:13.940 END TEST event_reactor 00:08:13.940 ************************************ 00:08:13.940 22:15:24 event -- common/autotest_common.sh@1142 -- # return 0 00:08:13.940 22:15:24 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:08:13.940 22:15:24 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:13.940 22:15:24 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:13.940 22:15:24 event -- common/autotest_common.sh@10 -- # set +x 00:08:13.940 ************************************ 00:08:13.940 START TEST event_reactor_perf 00:08:13.940 ************************************ 00:08:13.940 22:15:24 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:08:13.940 [2024-07-12 22:15:24.262261] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:08:13.940 [2024-07-12 22:15:24.262326] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3389661 ] 00:08:14.199 [2024-07-12 22:15:24.391189] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.199 [2024-07-12 22:15:24.492122] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.577 test_start 00:08:15.577 test_end 00:08:15.577 Performance: 328076 events per second 00:08:15.577 00:08:15.577 real 0m1.348s 00:08:15.577 user 0m1.211s 00:08:15.577 sys 0m0.130s 00:08:15.577 22:15:25 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:15.577 22:15:25 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:08:15.577 ************************************ 00:08:15.577 END TEST event_reactor_perf 00:08:15.577 ************************************ 00:08:15.577 22:15:25 event -- common/autotest_common.sh@1142 -- # return 0 00:08:15.577 22:15:25 event -- event/event.sh@49 -- # uname -s 00:08:15.577 22:15:25 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:08:15.577 22:15:25 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:08:15.577 22:15:25 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:15.577 22:15:25 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:15.577 22:15:25 event -- common/autotest_common.sh@10 -- # set +x 00:08:15.577 ************************************ 00:08:15.577 START TEST event_scheduler 00:08:15.577 ************************************ 00:08:15.577 22:15:25 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:08:15.577 * Looking for test storage... 00:08:15.577 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:08:15.577 22:15:25 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:08:15.577 22:15:25 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=3389885 00:08:15.577 22:15:25 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:08:15.577 22:15:25 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:08:15.577 22:15:25 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 3389885 00:08:15.577 22:15:25 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 3389885 ']' 00:08:15.577 22:15:25 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:15.577 22:15:25 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:15.577 22:15:25 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:15.577 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
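The three event-framework micro-benchmarks that just completed (event_perf, reactor, reactor_perf) are standalone binaries driven only by a core mask and a run time; condensed from the invocations recorded above, the whole sequence amounts to:

#!/usr/bin/env bash
# Condensed replay of the event micro-benchmarks exactly as invoked in this run.
set -e
cd /var/jenkins/workspace/crypto-phy-autotest/spdk
./test/event/event_perf/event_perf -m 0xF -t 1     # events across 4 reactors for 1 s; prints per-lcore counts
./test/event/reactor/reactor -t 1                  # single reactor; prints the oneshot/tick schedule seen above
./test/event/reactor_perf/reactor_perf -t 1        # single reactor; prints events per second (328076 in this run)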
00:08:15.577 22:15:25 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:15.577 22:15:25 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:15.577 [2024-07-12 22:15:25.828795] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:15.577 [2024-07-12 22:15:25.828867] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3389885 ] 00:08:15.836 [2024-07-12 22:15:25.930988] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:15.836 [2024-07-12 22:15:26.016085] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.836 [2024-07-12 22:15:26.016159] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:15.836 [2024-07-12 22:15:26.016221] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:15.836 [2024-07-12 22:15:26.016223] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:16.406 22:15:26 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:16.406 22:15:26 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:08:16.406 22:15:26 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:08:16.406 22:15:26 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.406 22:15:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:16.406 [2024-07-12 22:15:26.710834] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:08:16.406 [2024-07-12 22:15:26.710858] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:08:16.406 [2024-07-12 22:15:26.710870] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:08:16.406 [2024-07-12 22:15:26.710878] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:08:16.406 [2024-07-12 22:15:26.710885] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:08:16.406 22:15:26 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.406 22:15:26 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:08:16.406 22:15:26 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.406 22:15:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:16.666 [2024-07-12 22:15:26.802032] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
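The scheduler test that just initialized follows the usual SPDK --wait-for-rpc pattern: launch the app with subsystem initialization deferred, select the dynamic scheduler over RPC, then start framework initialization so the reactors come up already governed by it. Sketched in isolation, with rpc.py standing in for the harness's rpc_cmd helper (a substitution, not what the log literally ran):

#!/usr/bin/env bash
# Minimal sketch of the --wait-for-rpc scheduler bring-up shown above.
set -e
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f &
sleep 1                                               # the harness uses waitforlisten on /var/tmp/spdk.sock instead
$SPDK/scripts/rpc.py framework_set_scheduler dynamic  # must precede framework_start_init
$SPDK/scripts/rpc.py framework_start_init             # reactors start with the dynamic scheduler active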
00:08:16.666 22:15:26 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.666 22:15:26 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:08:16.666 22:15:26 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:16.666 22:15:26 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:16.666 22:15:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:16.666 ************************************ 00:08:16.666 START TEST scheduler_create_thread 00:08:16.666 ************************************ 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:16.666 2 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:16.666 3 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:16.666 4 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:16.666 5 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:16.666 6 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:16.666 7 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:16.666 8 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:16.666 9 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:16.666 10 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:16.666 22:15:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:17.234 22:15:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:17.234 22:15:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:08:17.234 22:15:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:17.234 22:15:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:18.612 22:15:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:18.612 22:15:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:08:18.612 22:15:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:08:18.612 22:15:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:18.612 22:15:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:19.989 22:15:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:19.989 00:08:19.989 real 0m3.102s 00:08:19.989 user 0m0.023s 00:08:19.989 sys 0m0.008s 00:08:19.989 22:15:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:19.989 22:15:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:19.989 ************************************ 00:08:19.989 END TEST scheduler_create_thread 00:08:19.989 ************************************ 00:08:19.989 22:15:29 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:08:19.989 22:15:29 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:08:19.989 22:15:29 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 3389885 00:08:19.990 22:15:29 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 3389885 ']' 00:08:19.990 22:15:29 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 3389885 00:08:19.990 22:15:29 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:08:19.990 22:15:29 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:19.990 22:15:29 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3389885 00:08:19.990 22:15:30 event.event_scheduler -- common/autotest_common.sh@954 -- # process_name=reactor_2 00:08:19.990 22:15:30 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:08:19.990 22:15:30 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3389885' 00:08:19.990 killing process with pid 3389885 00:08:19.990 22:15:30 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 3389885 00:08:19.990 22:15:30 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 3389885 00:08:20.249 [2024-07-12 22:15:30.325371] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
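Note (illustrative sketch, not part of the captured log): the scheduler_create_thread test above drives SPDK's dynamic scheduler entirely through plugin RPCs. The same sequence can be issued by hand through scripts/rpc.py roughly as follows; the socket path and the PYTHONPATH needed to make scheduler_plugin importable are assumptions here, and the thread ids are returned by the create calls rather than being fixed values.

    # Assumes the scheduler test app is already running and scheduler_plugin is
    # importable (e.g. PYTHONPATH=test/event/scheduler); rpc.py talks to the
    # default /var/tmp/spdk.sock unless -s is given.
    rpc="scripts/rpc.py --plugin scheduler_plugin"
    $rpc scheduler_thread_create -n active_pinned -m 0x1 -a 100    # busy thread pinned to core 0
    $rpc scheduler_thread_create -n idle_pinned   -m 0x1 -a 0      # idle thread pinned to core 0
    $rpc scheduler_thread_create -n one_third_active -a 30         # unpinned, ~30% active
    tid=$($rpc scheduler_thread_create -n half_active -a 0)        # create idle, keep its thread id
    $rpc scheduler_thread_set_active "$tid" 50                     # then raise it to 50% active
    tid=$($rpc scheduler_thread_create -n deleted -a 100)
    $rpc scheduler_thread_delete "$tid"                            # and delete it again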
00:08:20.249 00:08:20.249 real 0m4.909s 00:08:20.249 user 0m9.470s 00:08:20.249 sys 0m0.503s 00:08:20.249 22:15:30 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:20.249 22:15:30 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:20.249 ************************************ 00:08:20.249 END TEST event_scheduler 00:08:20.249 ************************************ 00:08:20.509 22:15:30 event -- common/autotest_common.sh@1142 -- # return 0 00:08:20.509 22:15:30 event -- event/event.sh@51 -- # modprobe -n nbd 00:08:20.509 22:15:30 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:08:20.509 22:15:30 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:20.509 22:15:30 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:20.509 22:15:30 event -- common/autotest_common.sh@10 -- # set +x 00:08:20.509 ************************************ 00:08:20.509 START TEST app_repeat 00:08:20.509 ************************************ 00:08:20.509 22:15:30 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@19 -- # repeat_pid=3390636 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 3390636' 00:08:20.509 Process app_repeat pid: 3390636 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:08:20.509 spdk_app_start Round 0 00:08:20.509 22:15:30 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3390636 /var/tmp/spdk-nbd.sock 00:08:20.509 22:15:30 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 3390636 ']' 00:08:20.509 22:15:30 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:20.509 22:15:30 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:20.509 22:15:30 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:20.509 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:20.509 22:15:30 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:20.509 22:15:30 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:20.509 [2024-07-12 22:15:30.707480] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:08:20.509 [2024-07-12 22:15:30.707548] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3390636 ] 00:08:20.768 [2024-07-12 22:15:30.842090] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:20.768 [2024-07-12 22:15:30.944464] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:20.768 [2024-07-12 22:15:30.944470] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.703 22:15:31 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:21.703 22:15:31 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:21.703 22:15:31 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:21.703 Malloc0 00:08:21.703 22:15:31 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:21.703 Malloc1 00:08:21.962 22:15:32 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:21.962 22:15:32 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:21.962 /dev/nbd0 00:08:22.220 22:15:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:22.220 22:15:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 
/proc/partitions 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:22.220 1+0 records in 00:08:22.220 1+0 records out 00:08:22.220 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240407 s, 17.0 MB/s 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:22.220 22:15:32 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:22.220 22:15:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:22.220 22:15:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:22.220 22:15:32 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:22.480 /dev/nbd1 00:08:22.480 22:15:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:22.480 22:15:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:22.480 1+0 records in 00:08:22.480 1+0 records out 00:08:22.480 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026627 s, 15.4 MB/s 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:22.480 22:15:32 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:22.480 22:15:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:22.480 22:15:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:22.480 22:15:32 
event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:22.480 22:15:32 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:22.480 22:15:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:22.739 { 00:08:22.739 "nbd_device": "/dev/nbd0", 00:08:22.739 "bdev_name": "Malloc0" 00:08:22.739 }, 00:08:22.739 { 00:08:22.739 "nbd_device": "/dev/nbd1", 00:08:22.739 "bdev_name": "Malloc1" 00:08:22.739 } 00:08:22.739 ]' 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:22.739 { 00:08:22.739 "nbd_device": "/dev/nbd0", 00:08:22.739 "bdev_name": "Malloc0" 00:08:22.739 }, 00:08:22.739 { 00:08:22.739 "nbd_device": "/dev/nbd1", 00:08:22.739 "bdev_name": "Malloc1" 00:08:22.739 } 00:08:22.739 ]' 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:22.739 /dev/nbd1' 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:22.739 /dev/nbd1' 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:22.739 256+0 records in 00:08:22.739 256+0 records out 00:08:22.739 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0102434 s, 102 MB/s 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:22.739 256+0 records in 00:08:22.739 256+0 records out 00:08:22.739 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0293456 s, 35.7 MB/s 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:22.739 256+0 records in 00:08:22.739 256+0 records out 00:08:22.739 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0245715 s, 42.7 MB/s 00:08:22.739 22:15:32 event.app_repeat -- 
bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:22.739 22:15:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:22.997 22:15:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:22.997 22:15:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:22.997 22:15:33 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:22.997 22:15:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:22.997 22:15:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:22.997 22:15:33 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:22.997 22:15:33 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:22.997 22:15:33 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:22.997 22:15:33 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:22.997 22:15:33 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:23.255 22:15:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:23.255 22:15:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:23.255 22:15:33 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:23.255 22:15:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:23.255 22:15:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:23.255 22:15:33 
event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:23.255 22:15:33 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:23.255 22:15:33 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:23.255 22:15:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:23.255 22:15:33 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:23.255 22:15:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:23.514 22:15:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:23.514 22:15:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:23.514 22:15:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:23.514 22:15:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:23.514 22:15:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:23.514 22:15:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:23.514 22:15:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:23.514 22:15:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:23.514 22:15:33 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:23.514 22:15:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:23.514 22:15:33 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:23.514 22:15:33 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:23.514 22:15:33 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:23.773 22:15:34 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:24.032 [2024-07-12 22:15:34.287212] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:24.291 [2024-07-12 22:15:34.386718] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:24.291 [2024-07-12 22:15:34.386722] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.291 [2024-07-12 22:15:34.438949] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:24.291 [2024-07-12 22:15:34.439005] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:26.824 22:15:37 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:26.824 22:15:37 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:08:26.824 spdk_app_start Round 1 00:08:26.824 22:15:37 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3390636 /var/tmp/spdk-nbd.sock 00:08:26.824 22:15:37 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 3390636 ']' 00:08:26.824 22:15:37 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:26.824 22:15:37 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:26.824 22:15:37 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:26.824 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:26.824 22:15:37 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:26.824 22:15:37 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:27.083 22:15:37 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:27.083 22:15:37 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:27.083 22:15:37 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:27.342 Malloc0 00:08:27.342 22:15:37 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:27.602 Malloc1 00:08:27.602 22:15:37 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:27.602 22:15:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:27.861 /dev/nbd0 00:08:27.861 22:15:38 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:27.861 22:15:38 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:08:27.861 1+0 records in 00:08:27.861 1+0 records out 00:08:27.861 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00020991 s, 19.5 MB/s 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:27.861 22:15:38 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:27.861 22:15:38 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:27.861 22:15:38 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:27.861 22:15:38 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:28.120 /dev/nbd1 00:08:28.120 22:15:38 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:28.120 22:15:38 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:28.120 1+0 records in 00:08:28.120 1+0 records out 00:08:28.120 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294081 s, 13.9 MB/s 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:28.120 22:15:38 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:28.120 22:15:38 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:28.120 22:15:38 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:28.120 22:15:38 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:28.120 22:15:38 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:28.120 22:15:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:28.379 
{ 00:08:28.379 "nbd_device": "/dev/nbd0", 00:08:28.379 "bdev_name": "Malloc0" 00:08:28.379 }, 00:08:28.379 { 00:08:28.379 "nbd_device": "/dev/nbd1", 00:08:28.379 "bdev_name": "Malloc1" 00:08:28.379 } 00:08:28.379 ]' 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:28.379 { 00:08:28.379 "nbd_device": "/dev/nbd0", 00:08:28.379 "bdev_name": "Malloc0" 00:08:28.379 }, 00:08:28.379 { 00:08:28.379 "nbd_device": "/dev/nbd1", 00:08:28.379 "bdev_name": "Malloc1" 00:08:28.379 } 00:08:28.379 ]' 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:28.379 /dev/nbd1' 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:28.379 /dev/nbd1' 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:28.379 256+0 records in 00:08:28.379 256+0 records out 00:08:28.379 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0111186 s, 94.3 MB/s 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:28.379 256+0 records in 00:08:28.379 256+0 records out 00:08:28.379 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0186645 s, 56.2 MB/s 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:28.379 256+0 records in 00:08:28.379 256+0 records out 00:08:28.379 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0284022 s, 36.9 MB/s 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:28.379 22:15:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:28.638 22:15:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:28.638 22:15:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:28.638 22:15:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:28.638 22:15:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:28.638 22:15:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:28.638 22:15:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:28.638 22:15:38 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:28.638 22:15:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:28.638 22:15:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:28.638 22:15:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:29.206 22:15:39 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:29.465 22:15:39 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:29.465 22:15:39 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:29.724 22:15:39 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:29.984 [2024-07-12 22:15:40.053363] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:29.984 [2024-07-12 22:15:40.150985] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:29.984 [2024-07-12 22:15:40.150990] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.984 [2024-07-12 22:15:40.200726] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:29.984 [2024-07-12 22:15:40.200781] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:32.587 22:15:42 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:32.587 22:15:42 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:08:32.587 spdk_app_start Round 2 00:08:32.587 22:15:42 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3390636 /var/tmp/spdk-nbd.sock 00:08:32.587 22:15:42 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 3390636 ']' 00:08:32.587 22:15:42 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:32.587 22:15:42 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:32.587 22:15:42 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:32.587 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:32.587 22:15:42 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:32.587 22:15:42 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:32.845 22:15:43 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:32.845 22:15:43 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:32.845 22:15:43 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:33.103 Malloc0 00:08:33.103 22:15:43 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:33.361 Malloc1 00:08:33.361 22:15:43 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:33.362 22:15:43 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:33.620 /dev/nbd0 00:08:33.620 22:15:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:33.620 22:15:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:33.620 22:15:43 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:33.620 22:15:43 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:33.620 22:15:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:33.620 22:15:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:33.620 22:15:43 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:33.620 22:15:43 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:33.620 22:15:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:33.620 22:15:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:33.621 22:15:43 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:08:33.621 1+0 records in 00:08:33.621 1+0 records out 00:08:33.621 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021178 s, 19.3 MB/s 00:08:33.621 22:15:43 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:33.621 22:15:43 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:33.621 22:15:43 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:33.621 22:15:43 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:33.621 22:15:43 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:33.621 22:15:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:33.621 22:15:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:33.621 22:15:43 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:33.879 /dev/nbd1 00:08:33.879 22:15:44 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:33.879 22:15:44 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:33.879 22:15:44 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:33.879 22:15:44 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:33.879 22:15:44 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:33.879 22:15:44 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:33.879 22:15:44 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:33.879 22:15:44 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:33.879 22:15:44 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:33.879 22:15:44 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:33.879 22:15:44 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:33.879 1+0 records in 00:08:33.879 1+0 records out 00:08:33.879 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232461 s, 17.6 MB/s 00:08:33.879 22:15:44 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:33.879 22:15:44 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:33.879 22:15:44 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:33.880 22:15:44 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:33.880 22:15:44 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:33.880 22:15:44 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:33.880 22:15:44 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:33.880 22:15:44 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:33.880 22:15:44 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:33.880 22:15:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:34.138 22:15:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:34.139 
{ 00:08:34.139 "nbd_device": "/dev/nbd0", 00:08:34.139 "bdev_name": "Malloc0" 00:08:34.139 }, 00:08:34.139 { 00:08:34.139 "nbd_device": "/dev/nbd1", 00:08:34.139 "bdev_name": "Malloc1" 00:08:34.139 } 00:08:34.139 ]' 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:34.139 { 00:08:34.139 "nbd_device": "/dev/nbd0", 00:08:34.139 "bdev_name": "Malloc0" 00:08:34.139 }, 00:08:34.139 { 00:08:34.139 "nbd_device": "/dev/nbd1", 00:08:34.139 "bdev_name": "Malloc1" 00:08:34.139 } 00:08:34.139 ]' 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:34.139 /dev/nbd1' 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:34.139 /dev/nbd1' 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:34.139 256+0 records in 00:08:34.139 256+0 records out 00:08:34.139 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109693 s, 95.6 MB/s 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:34.139 22:15:44 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:34.398 256+0 records in 00:08:34.398 256+0 records out 00:08:34.398 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0236284 s, 44.4 MB/s 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:34.398 256+0 records in 00:08:34.398 256+0 records out 00:08:34.398 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0202941 s, 51.7 MB/s 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.398 22:15:44 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:34.657 22:15:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:34.657 22:15:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:34.657 22:15:44 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:34.657 22:15:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.657 22:15:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.657 22:15:44 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:34.657 22:15:44 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:34.657 22:15:44 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.657 22:15:44 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.657 22:15:44 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:34.916 22:15:45 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:34.916 22:15:45 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:34.916 22:15:45 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:34.916 22:15:45 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.916 22:15:45 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.916 22:15:45 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:34.916 22:15:45 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:34.916 22:15:45 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.916 22:15:45 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:34.916 22:15:45 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:08:34.916 22:15:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:35.189 22:15:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:35.189 22:15:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:35.189 22:15:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:35.189 22:15:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:35.189 22:15:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:35.189 22:15:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:35.189 22:15:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:35.189 22:15:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:35.189 22:15:45 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:35.189 22:15:45 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:35.189 22:15:45 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:35.189 22:15:45 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:35.189 22:15:45 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:35.447 22:15:45 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:35.706 [2024-07-12 22:15:45.899693] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:35.706 [2024-07-12 22:15:45.998170] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:35.706 [2024-07-12 22:15:45.998176] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.964 [2024-07-12 22:15:46.050706] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:35.964 [2024-07-12 22:15:46.050755] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:38.496 22:15:48 event.app_repeat -- event/event.sh@38 -- # waitforlisten 3390636 /var/tmp/spdk-nbd.sock 00:08:38.496 22:15:48 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 3390636 ']' 00:08:38.496 22:15:48 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:38.496 22:15:48 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:38.496 22:15:48 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:38.496 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:38.496 22:15:48 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:38.496 22:15:48 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:38.755 22:15:48 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:38.755 22:15:48 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:38.755 22:15:48 event.app_repeat -- event/event.sh@39 -- # killprocess 3390636 00:08:38.755 22:15:48 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 3390636 ']' 00:08:38.755 22:15:48 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 3390636 00:08:38.755 22:15:48 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:08:38.755 22:15:48 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:38.755 22:15:48 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3390636 00:08:38.755 22:15:48 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:38.755 22:15:48 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:38.755 22:15:48 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3390636' 00:08:38.755 killing process with pid 3390636 00:08:38.755 22:15:48 event.app_repeat -- common/autotest_common.sh@967 -- # kill 3390636 00:08:38.755 22:15:48 event.app_repeat -- common/autotest_common.sh@972 -- # wait 3390636 00:08:39.013 spdk_app_start is called in Round 0. 00:08:39.013 Shutdown signal received, stop current app iteration 00:08:39.013 Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 reinitialization... 00:08:39.013 spdk_app_start is called in Round 1. 00:08:39.013 Shutdown signal received, stop current app iteration 00:08:39.013 Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 reinitialization... 00:08:39.013 spdk_app_start is called in Round 2. 00:08:39.013 Shutdown signal received, stop current app iteration 00:08:39.013 Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 reinitialization... 00:08:39.013 spdk_app_start is called in Round 3. 
00:08:39.013 Shutdown signal received, stop current app iteration 00:08:39.013 22:15:49 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:08:39.013 22:15:49 event.app_repeat -- event/event.sh@42 -- # return 0 00:08:39.013 00:08:39.013 real 0m18.490s 00:08:39.013 user 0m39.926s 00:08:39.013 sys 0m3.776s 00:08:39.013 22:15:49 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:39.013 22:15:49 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:39.013 ************************************ 00:08:39.013 END TEST app_repeat 00:08:39.013 ************************************ 00:08:39.013 22:15:49 event -- common/autotest_common.sh@1142 -- # return 0 00:08:39.013 22:15:49 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:08:39.013 00:08:39.013 real 0m27.978s 00:08:39.013 user 0m56.209s 00:08:39.013 sys 0m5.063s 00:08:39.013 22:15:49 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:39.013 22:15:49 event -- common/autotest_common.sh@10 -- # set +x 00:08:39.013 ************************************ 00:08:39.013 END TEST event 00:08:39.013 ************************************ 00:08:39.013 22:15:49 -- common/autotest_common.sh@1142 -- # return 0 00:08:39.013 22:15:49 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:08:39.013 22:15:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:39.014 22:15:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.014 22:15:49 -- common/autotest_common.sh@10 -- # set +x 00:08:39.014 ************************************ 00:08:39.014 START TEST thread 00:08:39.014 ************************************ 00:08:39.014 22:15:49 thread -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:08:39.273 * Looking for test storage... 00:08:39.273 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:08:39.273 22:15:49 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:08:39.273 22:15:49 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:39.273 22:15:49 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.273 22:15:49 thread -- common/autotest_common.sh@10 -- # set +x 00:08:39.273 ************************************ 00:08:39.273 START TEST thread_poller_perf 00:08:39.273 ************************************ 00:08:39.273 22:15:49 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:08:39.273 [2024-07-12 22:15:49.449607] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:39.273 [2024-07-12 22:15:49.449688] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3393330 ] 00:08:39.273 [2024-07-12 22:15:49.582339] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.532 [2024-07-12 22:15:49.684412] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.532 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:08:40.468 ====================================== 00:08:40.468 busy:2313440596 (cyc) 00:08:40.468 total_run_count: 265000 00:08:40.468 tsc_hz: 2300000000 (cyc) 00:08:40.468 ====================================== 00:08:40.468 poller_cost: 8729 (cyc), 3795 (nsec) 00:08:40.468 00:08:40.468 real 0m1.372s 00:08:40.468 user 0m1.224s 00:08:40.468 sys 0m0.141s 00:08:40.468 22:15:50 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:40.468 22:15:50 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:08:40.468 ************************************ 00:08:40.468 END TEST thread_poller_perf 00:08:40.468 ************************************ 00:08:40.728 22:15:50 thread -- common/autotest_common.sh@1142 -- # return 0 00:08:40.728 22:15:50 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:08:40.728 22:15:50 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:40.728 22:15:50 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:40.728 22:15:50 thread -- common/autotest_common.sh@10 -- # set +x 00:08:40.728 ************************************ 00:08:40.728 START TEST thread_poller_perf 00:08:40.728 ************************************ 00:08:40.728 22:15:50 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:08:40.728 [2024-07-12 22:15:50.895519] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:40.728 [2024-07-12 22:15:50.895584] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3393523 ] 00:08:40.728 [2024-07-12 22:15:51.025204] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.988 [2024-07-12 22:15:51.124145] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.988 Running 1000 pollers for 1 seconds with 0 microseconds period. 
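The summary block above consists of three raw counters and two derived figures, and the derivation can be reproduced straight from the numbers in the log. A hedged sketch (values copied from the run above; the formulas are the obvious cycles-per-call and cycles-to-nanoseconds conversions, not lifted from SPDK source):

    busy_cyc=2313440596     # busy cycles accumulated over the 1 second run
    run_count=265000        # total_run_count: how often the 1000 pollers ran in total
    tsc_hz=2300000000       # timestamp-counter frequency in cycles per second

    # per-call cost in cycles: busy cycles divided by the number of poller invocations
    cost_cyc=$(( busy_cyc / run_count ))                    # -> 8729, matching poller_cost above
    # convert cycles to nanoseconds via the TSC frequency
    cost_ns=$(awk -v c="$cost_cyc" -v hz="$tsc_hz" 'BEGIN { printf "%d", c / hz * 1e9 }')  # -> ~3795
    echo "poller_cost: ${cost_cyc} (cyc), ${cost_ns} (nsec)"

The same arithmetic explains the zero-period run reported next: roughly 13 times as many poller invocations land in the same second, so the per-call cost drops to 659 cycles, about 286 ns.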
00:08:41.925 ====================================== 00:08:41.925 busy:2302531422 (cyc) 00:08:41.925 total_run_count: 3492000 00:08:41.926 tsc_hz: 2300000000 (cyc) 00:08:41.926 ====================================== 00:08:41.926 poller_cost: 659 (cyc), 286 (nsec) 00:08:41.926 00:08:41.926 real 0m1.347s 00:08:41.926 user 0m1.208s 00:08:41.926 sys 0m0.132s 00:08:41.926 22:15:52 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:41.926 22:15:52 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:08:41.926 ************************************ 00:08:41.926 END TEST thread_poller_perf 00:08:41.926 ************************************ 00:08:42.185 22:15:52 thread -- common/autotest_common.sh@1142 -- # return 0 00:08:42.185 22:15:52 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:08:42.185 00:08:42.185 real 0m2.983s 00:08:42.185 user 0m2.536s 00:08:42.185 sys 0m0.454s 00:08:42.185 22:15:52 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:42.185 22:15:52 thread -- common/autotest_common.sh@10 -- # set +x 00:08:42.185 ************************************ 00:08:42.185 END TEST thread 00:08:42.185 ************************************ 00:08:42.185 22:15:52 -- common/autotest_common.sh@1142 -- # return 0 00:08:42.185 22:15:52 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:08:42.185 22:15:52 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:42.185 22:15:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:42.185 22:15:52 -- common/autotest_common.sh@10 -- # set +x 00:08:42.185 ************************************ 00:08:42.185 START TEST accel 00:08:42.185 ************************************ 00:08:42.185 22:15:52 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:08:42.185 * Looking for test storage... 00:08:42.185 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:42.185 22:15:52 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:08:42.185 22:15:52 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:08:42.185 22:15:52 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:42.185 22:15:52 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=3393770 00:08:42.185 22:15:52 accel -- accel/accel.sh@63 -- # waitforlisten 3393770 00:08:42.185 22:15:52 accel -- common/autotest_common.sh@829 -- # '[' -z 3393770 ']' 00:08:42.185 22:15:52 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:42.185 22:15:52 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:42.185 22:15:52 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:42.185 22:15:52 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:42.185 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
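The get_expected_opcs step that has just started builds the expected_opcs map by asking the freshly launched target which module is assigned to each accel opcode; the RPC call and the jq filter it uses appear a few entries below. A hedged sketch of that step (the jq expression is the one visible in the trace; the surrounding shell is illustrative, not the autotest script verbatim):

    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    declare -A expected_opcs

    # ask the target for its opcode-to-module assignments and flatten the JSON
    # object into "opcode=module" lines
    exp_opcs=($("$rpc_py" accel_get_opc_assignments \
        | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'))

    # record the module for every opcode; in this run each one resolves to "software"
    for opc_opt in "${exp_opcs[@]}"; do
        IFS='=' read -r opc module <<< "$opc_opt"
        expected_opcs["$opc"]=$module
    done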
00:08:42.185 22:15:52 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:42.185 22:15:52 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:42.185 22:15:52 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:42.185 22:15:52 accel -- common/autotest_common.sh@10 -- # set +x 00:08:42.185 22:15:52 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:42.185 22:15:52 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:42.185 22:15:52 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:42.185 22:15:52 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:42.185 22:15:52 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:42.185 22:15:52 accel -- accel/accel.sh@41 -- # jq -r . 00:08:42.185 [2024-07-12 22:15:52.502382] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:42.185 [2024-07-12 22:15:52.502458] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3393770 ] 00:08:42.444 [2024-07-12 22:15:52.632833] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.444 [2024-07-12 22:15:52.729632] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@862 -- # return 0 00:08:43.398 22:15:53 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:43.398 22:15:53 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:43.398 22:15:53 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:43.398 22:15:53 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:08:43.398 22:15:53 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:43.398 22:15:53 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:43.398 22:15:53 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in 
"${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:43.398 22:15:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:43.398 22:15:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:43.398 22:15:53 accel -- accel/accel.sh@75 -- # killprocess 3393770 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@948 -- # '[' -z 3393770 ']' 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@952 -- # kill -0 3393770 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@953 -- # uname 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3393770 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3393770' 00:08:43.398 killing process with pid 3393770 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@967 -- # kill 3393770 00:08:43.398 22:15:53 accel -- common/autotest_common.sh@972 -- # wait 3393770 00:08:43.658 22:15:53 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:43.658 22:15:53 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:08:43.658 22:15:53 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:43.658 22:15:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:43.658 22:15:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:43.658 22:15:53 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:08:43.658 22:15:53 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:08:43.658 22:15:53 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:08:43.658 22:15:53 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:43.658 22:15:53 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:43.658 22:15:53 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:43.658 22:15:53 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:43.658 22:15:53 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:43.658 22:15:53 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:08:43.658 22:15:53 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:08:43.917 22:15:53 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:43.917 22:15:53 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:08:43.917 22:15:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:43.917 22:15:54 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:08:43.917 22:15:54 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:43.917 22:15:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:43.917 22:15:54 accel -- common/autotest_common.sh@10 -- # set +x 00:08:43.917 ************************************ 00:08:43.917 START TEST accel_missing_filename 00:08:43.917 ************************************ 00:08:43.917 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:08:43.917 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:08:43.917 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:08:43.917 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:43.917 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:43.917 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:43.917 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:43.917 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:08:43.917 22:15:54 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:08:43.917 22:15:54 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:08:43.917 22:15:54 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:43.917 22:15:54 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:43.917 22:15:54 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:43.917 22:15:54 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:43.917 22:15:54 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:43.917 22:15:54 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:08:43.917 22:15:54 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:08:43.917 [2024-07-12 22:15:54.116504] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:43.917 [2024-07-12 22:15:54.116581] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3393987 ] 00:08:44.176 [2024-07-12 22:15:54.250171] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.176 [2024-07-12 22:15:54.355232] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.176 [2024-07-12 22:15:54.434741] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:44.436 [2024-07-12 22:15:54.509317] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:08:44.436 A filename is required. 
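The accel_missing_filename case above is a plain negative test: run accel_perf with -w compress but no -l input file and require that it fails with "A filename is required." A minimal, hedged sketch of the same check without the NOT/run_test machinery (binary path taken from the log; the /dev/fd config plumbing is omitted):

    perf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf

    # compress without -l must exit non-zero
    if "$perf" -t 1 -w compress; then
        echo "ERROR: accel_perf unexpectedly succeeded without an input file" >&2
        exit 1
    fi
    echo "accel_perf rejected the missing -l argument, as expected"

The es=234 / es=106 / es=1 lines that follow are, as far as the trace shows, the harness folding the raw exit status down to a simple failure flag; the bare check here only cares that the status is non-zero.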
00:08:44.436 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:08:44.436 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:44.436 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:08:44.436 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:08:44.436 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:08:44.436 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:44.436 00:08:44.436 real 0m0.526s 00:08:44.436 user 0m0.350s 00:08:44.436 sys 0m0.194s 00:08:44.436 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:44.436 22:15:54 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:08:44.436 ************************************ 00:08:44.436 END TEST accel_missing_filename 00:08:44.436 ************************************ 00:08:44.436 22:15:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:44.436 22:15:54 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:44.436 22:15:54 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:08:44.436 22:15:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:44.436 22:15:54 accel -- common/autotest_common.sh@10 -- # set +x 00:08:44.436 ************************************ 00:08:44.436 START TEST accel_compress_verify 00:08:44.436 ************************************ 00:08:44.436 22:15:54 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:44.436 22:15:54 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:08:44.436 22:15:54 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:44.436 22:15:54 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:44.436 22:15:54 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:44.436 22:15:54 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:44.436 22:15:54 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:44.436 22:15:54 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:44.436 22:15:54 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:44.436 22:15:54 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:08:44.436 22:15:54 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:44.436 22:15:54 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:44.436 22:15:54 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:44.436 22:15:54 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:44.436 22:15:54 
accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:44.436 22:15:54 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:08:44.436 22:15:54 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:08:44.436 [2024-07-12 22:15:54.722934] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:44.436 [2024-07-12 22:15:54.723003] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3394174 ] 00:08:44.696 [2024-07-12 22:15:54.852064] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.696 [2024-07-12 22:15:54.955377] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.955 [2024-07-12 22:15:55.026501] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:44.955 [2024-07-12 22:15:55.100128] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:08:44.955 00:08:44.955 Compression does not support the verify option, aborting. 00:08:44.955 22:15:55 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:08:44.955 22:15:55 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:44.955 22:15:55 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:08:44.955 22:15:55 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:08:44.955 22:15:55 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:08:44.955 22:15:55 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:44.955 00:08:44.955 real 0m0.509s 00:08:44.955 user 0m0.326s 00:08:44.955 sys 0m0.210s 00:08:44.955 22:15:55 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:44.955 22:15:55 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:08:44.955 ************************************ 00:08:44.955 END TEST accel_compress_verify 00:08:44.955 ************************************ 00:08:44.955 22:15:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:44.955 22:15:55 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:08:44.955 22:15:55 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:44.955 22:15:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:44.955 22:15:55 accel -- common/autotest_common.sh@10 -- # set +x 00:08:44.955 ************************************ 00:08:44.955 START TEST accel_wrong_workload 00:08:44.955 ************************************ 00:08:44.955 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:08:44.955 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:08:44.955 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:08:44.955 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:45.214 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:45.214 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:45.214 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:08:45.214 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:08:45.214 22:15:55 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:08:45.214 22:15:55 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:08:45.214 22:15:55 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:45.214 22:15:55 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:45.214 22:15:55 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:45.214 22:15:55 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:45.214 22:15:55 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:45.214 22:15:55 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:08:45.214 22:15:55 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:08:45.214 Unsupported workload type: foobar 00:08:45.214 [2024-07-12 22:15:55.315518] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:08:45.214 accel_perf options: 00:08:45.214 [-h help message] 00:08:45.214 [-q queue depth per core] 00:08:45.214 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:45.214 [-T number of threads per core 00:08:45.214 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:08:45.214 [-t time in seconds] 00:08:45.215 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:45.215 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:08:45.215 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:45.215 [-l for compress/decompress workloads, name of uncompressed input file 00:08:45.215 [-S for crc32c workload, use this seed value (default 0) 00:08:45.215 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:45.215 [-f for fill workload, use this BYTE value (default 255) 00:08:45.215 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:45.215 [-y verify result if this switch is on] 00:08:45.215 [-a tasks to allocate per core (default: same value as -q)] 00:08:45.215 Can be used to spread operations across a wider range of memory. 
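The usage dump above doubles as a reference for the positive tests later in the log; the crc32c case further down, for instance, drives the tool for one second with seed 32 and result verification switched on. A hedged example invocation assembled from the options listed above (binary path as used throughout this log; -q 64 is an illustrative value, not taken from the trace):

    # -t 1       run for one second
    # -w crc32c  workload type
    # -S 32      seed value for the crc32c workload
    # -o 4096    transfer size in bytes
    # -q 64      queue depth per core (illustrative)
    # -y         verify the results
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w crc32c -S 32 -o 4096 -q 64 -y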
00:08:45.215 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:08:45.215 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:45.215 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:45.215 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:45.215 00:08:45.215 real 0m0.044s 00:08:45.215 user 0m0.022s 00:08:45.215 sys 0m0.021s 00:08:45.215 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:45.215 22:15:55 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:08:45.215 ************************************ 00:08:45.215 END TEST accel_wrong_workload 00:08:45.215 ************************************ 00:08:45.215 Error: writing output failed: Broken pipe 00:08:45.215 22:15:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:45.215 22:15:55 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:08:45.215 22:15:55 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:08:45.215 22:15:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:45.215 22:15:55 accel -- common/autotest_common.sh@10 -- # set +x 00:08:45.215 ************************************ 00:08:45.215 START TEST accel_negative_buffers 00:08:45.215 ************************************ 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:08:45.215 22:15:55 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:08:45.215 22:15:55 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:08:45.215 22:15:55 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:45.215 22:15:55 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:45.215 22:15:55 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:45.215 22:15:55 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:45.215 22:15:55 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:45.215 22:15:55 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:08:45.215 22:15:55 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:08:45.215 -x option must be non-negative. 
00:08:45.215 [2024-07-12 22:15:55.438214] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:08:45.215 accel_perf options: 00:08:45.215 [-h help message] 00:08:45.215 [-q queue depth per core] 00:08:45.215 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:45.215 [-T number of threads per core 00:08:45.215 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:08:45.215 [-t time in seconds] 00:08:45.215 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:45.215 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:08:45.215 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:45.215 [-l for compress/decompress workloads, name of uncompressed input file 00:08:45.215 [-S for crc32c workload, use this seed value (default 0) 00:08:45.215 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:45.215 [-f for fill workload, use this BYTE value (default 255) 00:08:45.215 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:45.215 [-y verify result if this switch is on] 00:08:45.215 [-a tasks to allocate per core (default: same value as -q)] 00:08:45.215 Can be used to spread operations across a wider range of memory. 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:45.215 00:08:45.215 real 0m0.041s 00:08:45.215 user 0m0.030s 00:08:45.215 sys 0m0.011s 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:45.215 22:15:55 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:08:45.215 ************************************ 00:08:45.215 END TEST accel_negative_buffers 00:08:45.215 ************************************ 00:08:45.215 Error: writing output failed: Broken pipe 00:08:45.215 22:15:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:45.215 22:15:55 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:08:45.215 22:15:55 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:45.215 22:15:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:45.215 22:15:55 accel -- common/autotest_common.sh@10 -- # set +x 00:08:45.215 ************************************ 00:08:45.215 START TEST accel_crc32c 00:08:45.215 ************************************ 00:08:45.215 22:15:55 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:08:45.215 22:15:55 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:08:45.474 [2024-07-12 22:15:55.559935] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:45.474 [2024-07-12 22:15:55.560000] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3394270 ] 00:08:45.474 [2024-07-12 22:15:55.689048] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.475 [2024-07-12 22:15:55.790090] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:45.734 22:15:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:47.113 22:15:57 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:08:47.113 22:15:57 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:47.113 00:08:47.113 real 0m1.514s 00:08:47.113 user 0m1.324s 00:08:47.113 sys 0m0.190s 00:08:47.113 22:15:57 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:47.113 22:15:57 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:08:47.113 ************************************ 00:08:47.113 END TEST accel_crc32c 00:08:47.113 ************************************ 00:08:47.113 22:15:57 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:47.113 22:15:57 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:08:47.113 22:15:57 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:47.113 22:15:57 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:47.113 22:15:57 accel -- common/autotest_common.sh@10 -- # set +x 00:08:47.113 ************************************ 00:08:47.113 START TEST accel_crc32c_C2 00:08:47.113 ************************************ 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:08:47.113 [2024-07-12 22:15:57.156750] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:47.113 [2024-07-12 22:15:57.156815] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3394578 ] 00:08:47.113 [2024-07-12 22:15:57.285941] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.113 [2024-07-12 22:15:57.382821] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.113 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:08:47.373 22:15:57 
accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:47.373 22:15:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 
-- # read -r var val 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:48.312 00:08:48.312 real 0m1.479s 00:08:48.312 user 0m1.294s 00:08:48.312 sys 0m0.193s 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:48.312 22:15:58 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:08:48.312 ************************************ 00:08:48.312 END TEST accel_crc32c_C2 00:08:48.312 ************************************ 00:08:48.312 22:15:58 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:48.312 22:15:58 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:08:48.312 22:15:58 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:48.312 22:15:58 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:48.312 22:15:58 accel -- common/autotest_common.sh@10 -- # set +x 00:08:48.573 ************************************ 00:08:48.573 START TEST accel_copy 00:08:48.573 ************************************ 00:08:48.573 22:15:58 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.573 
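The run_test line above kicks off the copy pass by driving the accel_perf example binary whose full command line is recorded in the trace that follows. A minimal standalone sketch of that invocation, assuming the same SPDK checkout path and dropping the JSON config the harness feeds over /dev/fd/62 (empty in this run, so the software module is selected):

  # Sketch only -- binary path and flags are copied from the accel_perf command in this log.
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # assumed checkout location
  "$SPDK_DIR"/build/examples/accel_perf -t 1 -w copy -y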
22:15:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:48.573 22:15:58 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:08:48.573 [2024-07-12 22:15:58.711350] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:48.573 [2024-07-12 22:15:58.711408] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3394798 ] 00:08:48.573 [2024-07-12 22:15:58.841733] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:48.832 [2024-07-12 22:15:58.942769] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # 
val='4096 bytes' 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:48.832 22:15:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:50.284 22:16:00 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:08:50.284 22:16:00 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:50.284 00:08:50.284 real 0m1.510s 00:08:50.284 user 0m1.314s 00:08:50.284 sys 0m0.200s 00:08:50.284 22:16:00 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:50.284 22:16:00 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:08:50.284 ************************************ 00:08:50.284 END TEST accel_copy 00:08:50.284 ************************************ 00:08:50.284 22:16:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:50.284 22:16:00 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:50.284 22:16:00 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:50.284 22:16:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:50.284 22:16:00 accel -- common/autotest_common.sh@10 -- # set +x 00:08:50.284 ************************************ 00:08:50.284 START TEST accel_fill 00:08:50.284 ************************************ 00:08:50.284 22:16:00 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 
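The fill pass launched here adds -f 128 -q 64 -a 64 to the common arguments; those values surface in the trace below as the val=0x80 entry and the two val=64 entries. A hedged standalone sketch, under the same assumptions as the copy example above:

  # Sketch only -- flags mirror the run_test/accel_perf command lines in this log.
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # assumed checkout location
  "$SPDK_DIR"/build/examples/accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y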
]] 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:08:50.284 [2024-07-12 22:16:00.285570] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:50.284 [2024-07-12 22:16:00.285618] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3394997 ] 00:08:50.284 [2024-07-12 22:16:00.396966] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.284 [2024-07-12 22:16:00.497512] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.284 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.285 
22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.285 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.544 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:08:50.545 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.545 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.545 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.545 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:50.545 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.545 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.545 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:50.545 22:16:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:50.545 22:16:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:50.545 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:50.545 22:16:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@21 
-- # case "$var" in 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:08:51.481 22:16:01 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:51.481 00:08:51.481 real 0m1.478s 00:08:51.481 user 0m1.297s 00:08:51.481 sys 0m0.181s 00:08:51.481 22:16:01 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:51.481 22:16:01 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:08:51.481 ************************************ 00:08:51.481 END TEST accel_fill 00:08:51.481 ************************************ 00:08:51.481 22:16:01 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:51.481 22:16:01 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:08:51.481 22:16:01 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:51.481 22:16:01 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:51.481 22:16:01 accel -- common/autotest_common.sh@10 -- # set +x 00:08:51.740 ************************************ 00:08:51.740 START TEST accel_copy_crc32c 00:08:51.740 ************************************ 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- 
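Each pass ends with a real/user/sys summary like the ones above. When working from a saved copy of this console output, those per-test timing lines can be pulled out with a simple filter (the log file name below is hypothetical):

  # List the per-test wall-clock summaries from a saved console log.
  grep -E ' real[[:space:]]+[0-9]+m[0-9.]+s' crypto-phy-autotest-console.log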
accel/accel.sh@36 -- # [[ -n '' ]] 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:08:51.740 22:16:01 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:08:51.740 [2024-07-12 22:16:01.863280] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:51.740 [2024-07-12 22:16:01.863406] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3395193 ] 00:08:51.740 [2024-07-12 22:16:02.058560] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:51.999 [2024-07-12 22:16:02.163999] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:51.999 22:16:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 
00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:53.375 00:08:53.375 real 0m1.587s 00:08:53.375 user 0m1.351s 00:08:53.375 sys 0m0.238s 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:53.375 22:16:03 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:08:53.375 ************************************ 00:08:53.375 END TEST accel_copy_crc32c 00:08:53.375 ************************************ 00:08:53.375 22:16:03 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:53.375 22:16:03 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:08:53.375 22:16:03 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:53.375 22:16:03 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:53.375 22:16:03 accel -- common/autotest_common.sh@10 -- # set +x 00:08:53.375 ************************************ 00:08:53.375 START TEST accel_copy_crc32c_C2 00:08:53.375 ************************************ 00:08:53.375 22:16:03 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:08:53.375 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:08:53.375 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:08:53.375 22:16:03 
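The -C 2 variant launched above reuses the copy_crc32c workload; its trace below shows both a '4096 bytes' and an '8192 bytes' buffer being configured. A standalone sketch under the same assumptions as the earlier examples:

  # Sketch only -- flags copied from the run_test line, fd-based JSON config omitted.
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # assumed checkout location
  "$SPDK_DIR"/build/examples/accel_perf -t 1 -w copy_crc32c -y -C 2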
accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.375 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.376 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:08:53.376 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:08:53.376 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:08:53.376 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:53.376 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:53.376 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:53.376 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:53.376 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:53.376 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:08:53.376 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:08:53.376 [2024-07-12 22:16:03.516510] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:08:53.376 [2024-07-12 22:16:03.516569] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3395396 ] 00:08:53.376 [2024-07-12 22:16:03.646788] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.635 [2024-07-12 22:16:03.748512] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # 
IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:53.635 22:16:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:55.013 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:55.013 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:55.013 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:55.013 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:55.014 00:08:55.014 real 0m1.511s 00:08:55.014 user 0m1.326s 
00:08:55.014 sys 0m0.189s 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:55.014 22:16:04 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:08:55.014 ************************************ 00:08:55.014 END TEST accel_copy_crc32c_C2 00:08:55.014 ************************************ 00:08:55.014 22:16:05 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:55.014 22:16:05 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:08:55.014 22:16:05 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:55.014 22:16:05 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:55.014 22:16:05 accel -- common/autotest_common.sh@10 -- # set +x 00:08:55.014 ************************************ 00:08:55.014 START TEST accel_dualcast 00:08:55.014 ************************************ 00:08:55.014 22:16:05 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:08:55.014 22:16:05 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:08:55.014 [2024-07-12 22:16:05.065500] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
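The dualcast pass starting here uses only the common arguments (-t 1, -w dualcast, -y), all visible in the run_test and accel_perf command lines above. Standalone sketch, same assumptions as before:

  # Sketch only -- reproduces the dualcast invocation recorded in this log.
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # assumed checkout location
  "$SPDK_DIR"/build/examples/accel_perf -t 1 -w dualcast -y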
00:08:55.014 [2024-07-12 22:16:05.065540] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3395660 ] 00:08:55.014 [2024-07-12 22:16:05.178560] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.014 [2024-07-12 22:16:05.279382] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- 
accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:55.273 22:16:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:56.209 22:16:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:56.210 22:16:06 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:56.210 22:16:06 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:08:56.210 22:16:06 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:56.210 00:08:56.210 real 0m1.472s 00:08:56.210 user 0m1.297s 00:08:56.210 sys 0m0.178s 00:08:56.210 22:16:06 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:56.210 22:16:06 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:08:56.210 ************************************ 00:08:56.210 END TEST accel_dualcast 00:08:56.210 ************************************ 00:08:56.469 22:16:06 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:56.469 22:16:06 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:08:56.469 22:16:06 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:56.469 22:16:06 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:56.469 22:16:06 accel -- common/autotest_common.sh@10 -- # set +x 00:08:56.469 ************************************ 00:08:56.469 START TEST accel_compare 00:08:56.469 ************************************ 00:08:56.469 22:16:06 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:08:56.469 22:16:06 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:08:56.469 [2024-07-12 22:16:06.632541] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
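The compare pass launched here follows the same pattern as copy and dualcast, and like the earlier passes it will finish with the accel.sh@27 module/opcode checks. A standalone sketch of the bare invocation, under the same assumptions as above:

  # Sketch only -- flags mirror the accel_perf command line traced below.
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # assumed checkout location
  "$SPDK_DIR"/build/examples/accel_perf -t 1 -w compare -y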
00:08:56.469 [2024-07-12 22:16:06.632601] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3395946 ] 00:08:56.469 [2024-07-12 22:16:06.758720] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:56.728 [2024-07-12 22:16:06.860533] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.728 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 
00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:56.729 22:16:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@20 -- # val= 
00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:08:58.107 22:16:08 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:58.107 00:08:58.107 real 0m1.506s 00:08:58.107 user 0m1.303s 00:08:58.107 sys 0m0.207s 00:08:58.107 22:16:08 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:58.107 22:16:08 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:08:58.107 ************************************ 00:08:58.107 END TEST accel_compare 00:08:58.107 ************************************ 00:08:58.107 22:16:08 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:58.107 22:16:08 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:08:58.107 22:16:08 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:58.107 22:16:08 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:58.107 22:16:08 accel -- common/autotest_common.sh@10 -- # set +x 00:08:58.107 ************************************ 00:08:58.107 START TEST accel_xor 00:08:58.107 ************************************ 00:08:58.107 22:16:08 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:08:58.107 22:16:08 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:08:58.107 [2024-07-12 22:16:08.219817] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:08:58.107 [2024-07-12 22:16:08.219877] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3396142 ] 00:08:58.107 [2024-07-12 22:16:08.349253] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.366 [2024-07-12 22:16:08.447223] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.366 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:58.366 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.366 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.366 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.366 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:58.366 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.366 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.366 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.366 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:08:58.366 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:08:58.367 22:16:08 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:58.367 22:16:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:59.744 00:08:59.744 real 0m1.486s 00:08:59.744 user 0m1.308s 00:08:59.744 sys 0m0.184s 00:08:59.744 22:16:09 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:59.744 22:16:09 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:08:59.744 ************************************ 00:08:59.744 END TEST accel_xor 00:08:59.744 ************************************ 00:08:59.744 22:16:09 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:59.744 22:16:09 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:08:59.744 22:16:09 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:59.744 22:16:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:59.744 22:16:09 accel -- common/autotest_common.sh@10 -- # set +x 00:08:59.744 ************************************ 00:08:59.744 START TEST accel_xor 00:08:59.744 ************************************ 00:08:59.744 22:16:09 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:59.744 22:16:09 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:59.745 22:16:09 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:59.745 22:16:09 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:59.745 22:16:09 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:08:59.745 22:16:09 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:08:59.745 [2024-07-12 22:16:09.775638] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:08:59.745 [2024-07-12 22:16:09.775695] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3396345 ] 00:08:59.745 [2024-07-12 22:16:09.904080] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:59.745 [2024-07-12 22:16:10.004254] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:09:00.004 22:16:10 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.004 22:16:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:09:00.940 22:16:11 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:00.940 00:09:00.940 real 0m1.505s 00:09:00.940 user 0m1.311s 00:09:00.940 sys 0m0.194s 00:09:00.940 22:16:11 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:00.940 22:16:11 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:09:00.940 ************************************ 00:09:00.940 END TEST accel_xor 00:09:00.940 ************************************ 00:09:01.199 22:16:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:01.199 22:16:11 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:09:01.199 22:16:11 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:09:01.199 22:16:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:01.199 22:16:11 accel -- common/autotest_common.sh@10 -- # set +x 00:09:01.199 ************************************ 00:09:01.199 START TEST accel_dif_verify 00:09:01.199 ************************************ 00:09:01.199 22:16:11 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:09:01.199 22:16:11 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:09:01.199 [2024-07-12 22:16:11.357536] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:09:01.199 [2024-07-12 22:16:11.357599] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3396536 ] 00:09:01.199 [2024-07-12 22:16:11.486770] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:01.459 [2024-07-12 22:16:11.588107] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:01.459 22:16:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:02.846 22:16:12 accel.accel_dif_verify -- 
accel/accel.sh@21 -- # case "$var" in 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:09:02.846 22:16:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:02.846 00:09:02.846 real 0m1.515s 00:09:02.846 user 0m1.316s 00:09:02.846 sys 0m0.199s 00:09:02.846 22:16:12 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:02.846 22:16:12 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:09:02.846 ************************************ 00:09:02.846 END TEST accel_dif_verify 00:09:02.846 ************************************ 00:09:02.846 22:16:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:02.846 22:16:12 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:09:02.846 22:16:12 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:09:02.846 22:16:12 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:02.846 22:16:12 accel -- common/autotest_common.sh@10 -- # set +x 00:09:02.846 ************************************ 00:09:02.846 START TEST accel_dif_generate 00:09:02.846 ************************************ 00:09:02.846 22:16:12 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@19 
-- # read -r var val 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:09:02.846 22:16:12 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:09:02.846 [2024-07-12 22:16:12.937672] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:09:02.846 [2024-07-12 22:16:12.937729] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3396738 ] 00:09:02.846 [2024-07-12 22:16:13.064136] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:03.105 [2024-07-12 22:16:13.171952] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@23 -- 
# accel_opc=dif_generate 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:03.105 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 
accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:03.106 22:16:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:09:04.483 22:16:14 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:04.483 00:09:04.483 real 0m1.502s 00:09:04.483 user 0m1.320s 00:09:04.483 sys 0m0.186s 00:09:04.484 
22:16:14 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.484 22:16:14 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:09:04.484 ************************************ 00:09:04.484 END TEST accel_dif_generate 00:09:04.484 ************************************ 00:09:04.484 22:16:14 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:04.484 22:16:14 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:09:04.484 22:16:14 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:09:04.484 22:16:14 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.484 22:16:14 accel -- common/autotest_common.sh@10 -- # set +x 00:09:04.484 ************************************ 00:09:04.484 START TEST accel_dif_generate_copy 00:09:04.484 ************************************ 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:09:04.484 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:09:04.484 [2024-07-12 22:16:14.524713] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:09:04.484 [2024-07-12 22:16:14.524776] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3397030 ] 00:09:04.484 [2024-07-12 22:16:14.653009] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:04.484 [2024-07-12 22:16:14.752853] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.742 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- 
accel/accel.sh@20 -- # val= 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:04.743 22:16:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:05.679 22:16:15 
accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:05.679 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:05.680 00:09:05.680 real 0m1.499s 00:09:05.680 user 0m1.308s 00:09:05.680 sys 0m0.194s 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:05.680 22:16:15 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:09:05.680 ************************************ 00:09:05.680 END TEST accel_dif_generate_copy 00:09:05.680 ************************************ 00:09:05.938 22:16:16 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:05.938 22:16:16 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:09:05.938 22:16:16 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:05.938 22:16:16 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:09:05.938 22:16:16 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:05.938 22:16:16 accel -- common/autotest_common.sh@10 -- # set +x 00:09:05.938 ************************************ 00:09:05.938 START TEST accel_comp 00:09:05.938 ************************************ 00:09:05.938 22:16:16 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:09:05.938 
22:16:16 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:09:05.938 22:16:16 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:09:05.938 [2024-07-12 22:16:16.110062] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:09:05.938 [2024-07-12 22:16:16.110129] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3397294 ] 00:09:05.938 [2024-07-12 22:16:16.241399] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:06.196 [2024-07-12 22:16:16.346924] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.196 
22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.196 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:06.197 22:16:16 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:06.197 22:16:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:09:07.574 22:16:17 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:07.574 00:09:07.574 real 0m1.525s 00:09:07.574 user 0m1.337s 00:09:07.574 sys 0m0.196s 00:09:07.574 22:16:17 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:07.574 22:16:17 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:09:07.574 ************************************ 00:09:07.574 END TEST accel_comp 00:09:07.574 ************************************ 00:09:07.574 22:16:17 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:07.574 22:16:17 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:07.574 22:16:17 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:07.574 22:16:17 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:07.574 22:16:17 accel -- common/autotest_common.sh@10 -- # set +x 00:09:07.574 ************************************ 00:09:07.574 START TEST accel_decomp 
00:09:07.574 ************************************ 00:09:07.574 22:16:17 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:09:07.575 22:16:17 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:09:07.575 [2024-07-12 22:16:17.705480] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:09:07.575 [2024-07-12 22:16:17.705540] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3397491 ] 00:09:07.575 [2024-07-12 22:16:17.833819] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:07.834 [2024-07-12 22:16:17.935295] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 
00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 
22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:07.834 22:16:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:09.210 22:16:19 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:09.210 00:09:09.210 real 0m1.511s 00:09:09.210 user 0m1.322s 00:09:09.210 sys 0m0.192s 00:09:09.210 22:16:19 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:09.210 22:16:19 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:09:09.210 ************************************ 00:09:09.210 END TEST accel_decomp 00:09:09.210 ************************************ 
00:09:09.210 22:16:19 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:09.210 22:16:19 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:09.210 22:16:19 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:09.210 22:16:19 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:09.210 22:16:19 accel -- common/autotest_common.sh@10 -- # set +x 00:09:09.210 ************************************ 00:09:09.210 START TEST accel_decomp_full 00:09:09.210 ************************************ 00:09:09.210 22:16:19 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:09.210 22:16:19 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:09:09.210 22:16:19 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:09:09.210 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.210 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.210 22:16:19 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:09.210 22:16:19 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:09.211 22:16:19 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:09:09.211 22:16:19 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:09.211 22:16:19 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:09.211 22:16:19 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:09.211 22:16:19 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:09.211 22:16:19 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:09.211 22:16:19 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:09:09.211 22:16:19 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:09:09.211 [2024-07-12 22:16:19.289607] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:09:09.211 [2024-07-12 22:16:19.289667] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3397687 ] 00:09:09.211 [2024-07-12 22:16:19.418452] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.211 [2024-07-12 22:16:19.515650] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.469 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 
accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:09.470 22:16:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 
00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:10.479 22:16:20 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:10.479 00:09:10.479 real 0m1.511s 00:09:10.479 user 0m1.320s 00:09:10.479 sys 0m0.188s 00:09:10.479 22:16:20 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:10.479 22:16:20 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:09:10.479 ************************************ 00:09:10.479 END TEST accel_decomp_full 00:09:10.479 ************************************ 00:09:10.479 22:16:20 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:10.479 22:16:20 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:10.479 22:16:20 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:10.479 22:16:20 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:10.479 22:16:20 accel -- common/autotest_common.sh@10 -- # set +x 00:09:10.738 ************************************ 00:09:10.738 START TEST accel_decomp_mcore 00:09:10.738 ************************************ 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w 
decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:10.738 22:16:20 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:10.738 [2024-07-12 22:16:20.871458] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:09:10.738 [2024-07-12 22:16:20.871515] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3397891 ] 00:09:10.738 [2024-07-12 22:16:21.004620] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:10.997 [2024-07-12 22:16:21.110451] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:10.997 [2024-07-12 22:16:21.110536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:10.997 [2024-07-12 22:16:21.110617] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:10.997 [2024-07-12 22:16:21.110623] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:10.997 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- 
accel/accel.sh@20 -- # val='1 seconds' 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:10.998 22:16:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 
00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:12.376 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.377 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.377 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.377 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:12.377 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:12.377 22:16:22 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:12.377 00:09:12.377 real 0m1.526s 00:09:12.377 user 0m4.774s 00:09:12.377 sys 0m0.210s 00:09:12.377 22:16:22 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:12.377 22:16:22 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:12.377 ************************************ 00:09:12.377 END TEST accel_decomp_mcore 00:09:12.377 ************************************ 00:09:12.377 22:16:22 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:12.377 22:16:22 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:12.377 22:16:22 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:12.377 22:16:22 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:12.377 22:16:22 accel -- common/autotest_common.sh@10 -- # set +x 00:09:12.377 ************************************ 00:09:12.377 START TEST accel_decomp_full_mcore 00:09:12.377 ************************************ 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:12.377 
22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:12.377 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:12.377 [2024-07-12 22:16:22.476789] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:09:12.377 [2024-07-12 22:16:22.476848] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3398148 ] 00:09:12.377 [2024-07-12 22:16:22.606861] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:12.636 [2024-07-12 22:16:22.712141] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:12.636 [2024-07-12 22:16:22.712227] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:12.636 [2024-07-12 22:16:22.712303] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:12.636 [2024-07-12 22:16:22.712309] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:12.637 22:16:22 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:12.637 22:16:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:14.013 00:09:14.013 real 0m1.530s 00:09:14.013 user 0m4.791s 00:09:14.013 sys 0m0.208s 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:14.013 22:16:23 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:14.013 ************************************ 00:09:14.013 END TEST accel_decomp_full_mcore 00:09:14.013 ************************************ 00:09:14.013 22:16:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:14.013 22:16:24 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:14.013 22:16:24 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:14.013 22:16:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:14.013 22:16:24 accel -- common/autotest_common.sh@10 -- # set +x 00:09:14.013 ************************************ 00:09:14.013 START TEST accel_decomp_mthread 00:09:14.013 ************************************ 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:14.013 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:14.013 
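The accel_decomp_full_mcore block that finishes here runs one second of software decompress against test/accel/bib with verification (-y), a 0xf core mask matching the four reactors reported on cores 0-3, and -o 0, which the harness pairs with the full 111250-byte input instead of the 4096-byte chunks used by the non-"full" variants. A manual re-run would look roughly like the sketch below; it assumes a built SPDK tree with hugepages configured, and the flag meanings are inferred from this trace rather than quoted from accel_perf's help text.

    # Roughly equivalent manual invocation, paths relative to the SPDK tree.
    ./build/examples/accel_perf \
        -t 1 -w decompress \
        -l test/accel/bib -y \
        -o 0 \
        -m 0xf    # one worker on each of cores 0-3, matching the reactors above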
22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:14.013 [2024-07-12 22:16:24.080886] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:09:14.013 [2024-07-12 22:16:24.080950] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3398464 ] 00:09:14.013 [2024-07-12 22:16:24.209221] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:14.013 [2024-07-12 22:16:24.310694] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:14.271 22:16:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:14.271 22:16:24 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:15.647 00:09:15.647 real 0m1.515s 00:09:15.647 user 0m1.318s 00:09:15.647 sys 0m0.200s 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:15.647 22:16:25 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:15.647 ************************************ 00:09:15.647 END TEST accel_decomp_mthread 00:09:15.647 ************************************ 00:09:15.647 22:16:25 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:15.647 22:16:25 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:15.647 22:16:25 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 
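accel_decomp_mthread, whose timing was just reported, repeats the same software decompress but on a single core (core mask 0x1, one reactor) with -T 2, i.e. two worker threads per core, and a 4096-byte operation size. Together with the mcore and "full" runs this gives the matrix of variants seen in this part of the log; a hypothetical driver for that matrix could look like the following, where run_variant and its labels are invented helpers rather than SPDK script functions.

    # Hypothetical driver for the decompress variants traced in this log.
    BIB=test/accel/bib
    run_variant() {            # $1 = label, remaining args go to accel_perf
      local label=$1; shift
      echo "== $label =="
      ./build/examples/accel_perf -t 1 -w decompress -l "$BIB" -y "$@"
    }
    run_variant mcore        -o 0 -m 0xf
    run_variant mthread      -T 2
    run_variant full_mthread -o 0 -T 2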
00:09:15.647 22:16:25 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:15.647 22:16:25 accel -- common/autotest_common.sh@10 -- # set +x 00:09:15.647 ************************************ 00:09:15.647 START TEST accel_decomp_full_mthread 00:09:15.647 ************************************ 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:15.647 [2024-07-12 22:16:25.653348] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:09:15.647 [2024-07-12 22:16:25.653391] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3398663 ] 00:09:15.647 [2024-07-12 22:16:25.762998] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:15.647 [2024-07-12 22:16:25.862474] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.647 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.648 
22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:15.648 22:16:25 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:15.648 22:16:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:17.024 00:09:17.024 real 0m1.504s 00:09:17.024 user 0m1.336s 00:09:17.024 sys 0m0.172s 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:17.024 22:16:27 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:17.024 ************************************ 00:09:17.024 END TEST accel_decomp_full_mthread 00:09:17.024 ************************************ 
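Every test block in this section closes the same way: a START/END banner pair from run_test, real/user/sys timings for the accel_test invocation, and xtrace switched off around the bookkeeping (xtrace_disable, set +x). The sketch below is a deliberately simplified stand-in for that wrapper; the real common/autotest_common.sh run_test also performs argument checks such as the '[' 13 -le 1 ']' test visible above, which this sketch omits.

    # Simplified stand-in for a run_test-style wrapper: banner, timing, banner.
    run_test() {
      local name=$1; shift
      echo "************************************"
      echo "START TEST $name"
      echo "************************************"
      time "$@"                  # produces the real/user/sys lines
      local rc=$?
      echo "************************************"
      echo "END TEST $name"
      echo "************************************"
      return $rc
    }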
00:09:17.024 22:16:27 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:17.024 22:16:27 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:09:17.024 22:16:27 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:09:17.024 22:16:27 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:09:17.024 22:16:27 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:17.024 22:16:27 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=3398859 00:09:17.024 22:16:27 accel -- accel/accel.sh@63 -- # waitforlisten 3398859 00:09:17.024 22:16:27 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:09:17.024 22:16:27 accel -- common/autotest_common.sh@829 -- # '[' -z 3398859 ']' 00:09:17.024 22:16:27 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:17.024 22:16:27 accel -- accel/accel.sh@61 -- # build_accel_config 00:09:17.024 22:16:27 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:17.024 22:16:27 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:17.024 22:16:27 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:17.024 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:17.024 22:16:27 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:17.024 22:16:27 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:17.024 22:16:27 accel -- common/autotest_common.sh@10 -- # set +x 00:09:17.024 22:16:27 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:17.024 22:16:27 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:17.024 22:16:27 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:17.024 22:16:27 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:17.024 22:16:27 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:17.024 22:16:27 accel -- accel/accel.sh@41 -- # jq -r . 00:09:17.024 [2024-07-12 22:16:27.238347] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
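From here on the harness stops relying on the built-in software module and starts a full spdk_tgt (pid 3398859 in this run) whose accel configuration arrives as JSON on a file descriptor; the traced accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') line is the fragment that enables the DPDK compressdev module. A hedged sketch of that startup pattern follows; only the method name and params come from the log, while the surrounding "subsystems" wrapper, the use of process substitution instead of the harness's /dev/fd/63, and the wait loop are assumptions.

    # Sketch: start spdk_tgt with a compressdev accel config and wait for RPC.
    cfg='{"subsystems":[{"subsystem":"accel","config":[
          {"method":"compressdev_scan_accel_module","params":{"pmd":0}}]}]}'
    ./build/bin/spdk_tgt -c <(echo "$cfg") &
    tgt_pid=$!
    # crude wait-for-listen loop on the default RPC socket
    until ./scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
      sleep 0.2
    done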
00:09:17.024 [2024-07-12 22:16:27.238416] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3398859 ] 00:09:17.283 [2024-07-12 22:16:27.367548] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:17.283 [2024-07-12 22:16:27.465078] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:18.216 [2024-07-12 22:16:28.212250] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:18.216 22:16:28 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:18.216 22:16:28 accel -- common/autotest_common.sh@862 -- # return 0 00:09:18.216 22:16:28 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:09:18.216 22:16:28 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:09:18.216 22:16:28 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:09:18.216 22:16:28 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:09:18.216 22:16:28 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:09:18.216 22:16:28 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:09:18.216 22:16:28 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:09:18.216 22:16:28 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:18.216 22:16:28 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:09:18.216 22:16:28 accel -- common/autotest_common.sh@10 -- # set +x 00:09:18.475 22:16:28 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:18.475 "method": "compressdev_scan_accel_module", 00:09:18.475 22:16:28 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:09:18.475 22:16:28 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:09:18.475 22:16:28 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:18.475 22:16:28 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:09:18.475 22:16:28 accel -- common/autotest_common.sh@10 -- # set +x 00:09:18.475 22:16:28 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:18.475 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.475 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:18.475 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.475 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:18.475 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.475 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:18.475 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.475 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:18.475 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.475 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:18.475 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.475 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:18.475 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.475 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.476 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:09:18.476 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.476 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:09:18.476 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.476 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:18.476 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.476 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:18.476 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.476 22:16:28 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:09:18.476 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.476 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:18.476 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.476 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:18.476 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.476 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:18.476 22:16:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:18.476 22:16:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:18.476 22:16:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:18.476 22:16:28 accel -- accel/accel.sh@75 -- # killprocess 3398859 00:09:18.476 22:16:28 accel -- common/autotest_common.sh@948 -- # '[' -z 3398859 ']' 00:09:18.476 22:16:28 accel -- common/autotest_common.sh@952 -- # kill -0 3398859 00:09:18.476 22:16:28 accel -- common/autotest_common.sh@953 -- # uname 00:09:18.476 22:16:28 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:18.476 22:16:28 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3398859 00:09:18.476 22:16:28 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:18.476 22:16:28 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:18.476 22:16:28 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3398859' 00:09:18.476 killing process with pid 3398859 00:09:18.476 22:16:28 accel -- common/autotest_common.sh@967 -- # kill 3398859 00:09:18.476 22:16:28 accel -- common/autotest_common.sh@972 -- # wait 3398859 00:09:19.041 22:16:29 accel -- accel/accel.sh@76 -- # trap - ERR 00:09:19.041 22:16:29 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:19.041 22:16:29 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:09:19.041 22:16:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:19.041 22:16:29 accel -- common/autotest_common.sh@10 -- # set +x 00:09:19.041 ************************************ 00:09:19.041 START TEST accel_cdev_comp 00:09:19.041 ************************************ 00:09:19.041 22:16:29 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:19.041 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:09:19.041 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:09:19.041 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.041 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.042 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:19.042 22:16:29 
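The sequence that just completed saves the running configuration to confirm compressdev_scan_accel_module persisted, queries accel_get_opc_assignments expecting the compress and decompress opcodes to map to dpdk_compressdev while the remaining opcodes stay on software, and then tears the target down with a kill/wait pair keyed on the recorded pid. The RPC-side equivalent, stripped of the harness plumbing, is roughly the following; the jq filters are the ones traced above, and tgt_pid is assumed from the startup sketch earlier.

    # Check which module each accel opcode is bound to, then stop the target.
    ./scripts/rpc.py save_config \
      | jq -r '.subsystems[] | select(.subsystem=="accel").config[]' \
      | grep compressdev_scan_accel_module
    ./scripts/rpc.py accel_get_opc_assignments \
      | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
    kill "$tgt_pid" && wait "$tgt_pid"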
accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:19.042 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:09:19.042 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:19.042 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:19.042 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:19.042 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:19.042 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:19.042 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:19.042 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:09:19.042 22:16:29 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:09:19.042 [2024-07-12 22:16:29.159443] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:09:19.042 [2024-07-12 22:16:29.159512] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3399087 ] 00:09:19.042 [2024-07-12 22:16:29.290009] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:19.299 [2024-07-12 22:16:29.388094] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:19.866 [2024-07-12 22:16:30.152207] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:19.866 [2024-07-12 22:16:30.154871] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1623080 PMD being used: compress_qat 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:19.866 [2024-07-12 22:16:30.159049] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1627e60 PMD being used: compress_qat 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 
22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp 
-- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:19.866 22:16:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:21.243 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:21.244 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:21.244 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:21.244 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:21.244 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:21.244 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:09:21.244 22:16:31 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:21.244 00:09:21.244 real 0m2.218s 00:09:21.244 user 0m1.643s 00:09:21.244 sys 0m0.579s 00:09:21.244 22:16:31 accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:21.244 22:16:31 
accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:09:21.244 ************************************ 00:09:21.244 END TEST accel_cdev_comp 00:09:21.244 ************************************ 00:09:21.244 22:16:31 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:21.244 22:16:31 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:21.244 22:16:31 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:21.244 22:16:31 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:21.244 22:16:31 accel -- common/autotest_common.sh@10 -- # set +x 00:09:21.244 ************************************ 00:09:21.244 START TEST accel_cdev_decomp 00:09:21.244 ************************************ 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:09:21.244 22:16:31 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:09:21.244 [2024-07-12 22:16:31.450725] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
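accel_cdev_comp, which just ended, is the first run where the reported accel_module is dpdk_compressdev rather than software; the "Channel 0x... PMD being used: compress_qat" notices show the DPDK compressdev QAT poll-mode driver being attached to each channel, and the noticeably larger sys time (0m0.579s here versus roughly 0.2s for the software runs) plausibly includes that device setup. The post-run assertions visible in the trace reduce to a pattern like the sketch below, where accel_module and accel_opc are the variables the trace shows being filled in and "expected" is a placeholder for the value each test hard-codes.

    # Sketch of the post-run assertion pattern: a module and an opcode were
    # recorded, and the module matches what this particular test expects.
    expected=dpdk_compressdev        # the earlier decomp tests expect software
    [[ -n $accel_module ]] || exit 1
    [[ -n $accel_opc    ]] || exit 1
    [[ $accel_module == "$expected" ]] || { echo "wrong module: $accel_module"; exit 1; }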
00:09:21.244 [2024-07-12 22:16:31.450785] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3399426 ] 00:09:21.503 [2024-07-12 22:16:31.579708] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:21.503 [2024-07-12 22:16:31.680208] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:22.439 [2024-07-12 22:16:32.452744] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:22.439 [2024-07-12 22:16:32.455459] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1abb080 PMD being used: compress_qat 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 [2024-07-12 22:16:32.459696] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1abfe60 PMD being used: compress_qat 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:22.439 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 
00:09:22.440 22:16:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:23.373 00:09:23.373 real 0m2.226s 00:09:23.373 user 0m1.624s 00:09:23.373 sys 0m0.598s 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:23.373 22:16:33 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:09:23.373 ************************************ 00:09:23.373 END TEST accel_cdev_decomp 00:09:23.373 ************************************ 00:09:23.373 22:16:33 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:23.373 22:16:33 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:23.373 22:16:33 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:23.373 22:16:33 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:23.373 22:16:33 accel -- common/autotest_common.sh@10 -- # set +x 00:09:23.632 ************************************ 00:09:23.632 START TEST accel_cdev_decomp_full 00:09:23.632 ************************************ 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # 
accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:09:23.632 22:16:33 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:09:23.632 [2024-07-12 22:16:33.758453] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
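(Editor's note) accel_cdev_decomp_full repeats the previous decompress run with one extra flag, -o 0. Judging from the values the harness echoes ('4096 bytes' in the plain run above versus '111250 bytes' below), -o 0 makes the wrapper size each operation to the whole bib input file rather than the default 4 KiB chunk; that reading is inferred from the echoed values, not from accel_perf documentation. The two invocations, copied from the xtrace (SPDK_DIR as in the sketch further up):

  # plain decompress run (accel_cdev_decomp):
  "$SPDK_DIR"/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l "$SPDK_DIR"/test/accel/bib -y
  # full-buffer variant (accel_cdev_decomp_full):
  "$SPDK_DIR"/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l "$SPDK_DIR"/test/accel/bib -y -o 0
  # if the inference above is right, the size echoed below should match the file itself:
  stat -c %s "$SPDK_DIR"/test/accel/bib        # expected: 111250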
00:09:23.632 [2024-07-12 22:16:33.758517] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3399804 ] 00:09:23.632 [2024-07-12 22:16:33.872401] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:23.891 [2024-07-12 22:16:33.978944] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:24.459 [2024-07-12 22:16:34.749484] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:24.459 [2024-07-12 22:16:34.752168] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1af7080 PMD being used: compress_qat 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 [2024-07-12 22:16:34.755609] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1af6ce0 PMD being used: compress_qat 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 
-- # val='111250 bytes' 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:24.459 22:16:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:25.838 00:09:25.838 real 0m2.217s 00:09:25.838 user 0m1.625s 00:09:25.838 sys 0m0.590s 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:25.838 22:16:35 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:09:25.838 ************************************ 00:09:25.838 END TEST accel_cdev_decomp_full 00:09:25.838 ************************************ 00:09:25.838 22:16:35 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:25.838 22:16:35 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:25.838 22:16:35 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:25.838 22:16:35 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:25.838 22:16:35 accel -- common/autotest_common.sh@10 -- # set +x 00:09:25.838 ************************************ 00:09:25.838 START TEST accel_cdev_decomp_mcore 00:09:25.838 ************************************ 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:25.838 22:16:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:25.838 [2024-07-12 22:16:36.057910] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
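(Editor's note) The mcore variant adds -m 0xf, and the EAL parameters below accordingly switch from -c 0x1 to -c 0xf, so this run comes up with four reactors (cores 0 through 3) instead of one. A quick way to read such a mask, shown here only as a sketch in plain bash, not as part of the SPDK harness:

  mask=0xf
  cores=()
  for ((i = 0; i < 64; i++)); do
          # collect every core index whose bit is set in the mask
          (( (mask >> i) & 1 )) && cores+=("$i")
  done
  echo "core mask $mask -> cores ${cores[*]}"     # core mask 0xf -> cores 0 1 2 3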
00:09:25.838 [2024-07-12 22:16:36.058068] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3400098 ] 00:09:26.097 [2024-07-12 22:16:36.253180] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:26.097 [2024-07-12 22:16:36.361904] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:26.097 [2024-07-12 22:16:36.362005] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:26.097 [2024-07-12 22:16:36.362031] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:26.097 [2024-07-12 22:16:36.362039] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:27.033 [2024-07-12 22:16:37.127581] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:27.033 [2024-07-12 22:16:37.130367] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x155e720 PMD being used: compress_qat 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.033 [2024-07-12 22:16:37.136016] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f25d419b8b0 PMD being used: compress_qat 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.033 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.033 [2024-07-12 22:16:37.136697] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f25cc19b8b0 PMD being used: compress_qat 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 [2024-07-12 22:16:37.137778] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15639f0 PMD being used: compress_qat 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:27.034 [2024-07-12 22:16:37.138022] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f25c419b8b0 PMD being used: compress_qat 00:09:27.034 22:16:37 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:27.034 22:16:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:28.412 00:09:28.412 real 0m2.318s 00:09:28.412 user 0m7.250s 00:09:28.412 sys 0m0.640s 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:28.412 22:16:38 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:28.412 ************************************ 00:09:28.412 END TEST accel_cdev_decomp_mcore 00:09:28.412 ************************************ 00:09:28.412 22:16:38 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:28.412 22:16:38 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:28.412 22:16:38 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:28.412 22:16:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:28.412 22:16:38 accel -- common/autotest_common.sh@10 -- # set +x 00:09:28.412 ************************************ 00:09:28.412 START TEST accel_cdev_decomp_full_mcore 00:09:28.412 ************************************ 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:28.412 
22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:28.412 22:16:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:28.412 [2024-07-12 22:16:38.447707] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:09:28.412 [2024-07-12 22:16:38.447772] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3400382 ] 00:09:28.412 [2024-07-12 22:16:38.562523] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:28.412 [2024-07-12 22:16:38.671905] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:28.412 [2024-07-12 22:16:38.671996] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:28.412 [2024-07-12 22:16:38.672023] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:28.412 [2024-07-12 22:16:38.672027] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:29.353 [2024-07-12 22:16:39.425866] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:29.353 [2024-07-12 22:16:39.428467] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f4d720 PMD being used: compress_qat 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 [2024-07-12 22:16:39.433135] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7face819b8b0 PMD being used: compress_qat 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:29.353 [2024-07-12 22:16:39.433836] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7face019b8b0 PMD being used: compress_qat 
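(Editor's note) accel_cdev_decomp_full_mcore combines the two previous variations (-o 0 -m 0xf): full-size 111250-byte operations spread across four reactors. As in the mcore run above, the accel_dpdk_compressdev module logs one "Channel ... PMD being used: compress_qat" notice per channel it opens. A rough way to count those notices per test from a saved copy of this console output (build.log is only a placeholder name; the job itself does not write such a file):

  awk '/START TEST accel_cdev_decomp_full_mcore/,/END TEST accel_cdev_decomp_full_mcore/' build.log \
          | grep -c 'PMD being used: compress_qat'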
00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 [2024-07-12 22:16:39.434932] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f50a30 PMD being used: compress_qat 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 [2024-07-12 22:16:39.435157] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7facd819b8b0 PMD being used: compress_qat 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:29.353 22:16:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:30.367 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:30.367 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:30.367 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:30.367 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:30.367 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:30.367 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:30.367 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:30.367 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
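(Editor's note) Each test above ends with the real/user/sys figures printed by bash's time builtin (for example real 0m2.226s for accel_cdev_decomp, and about 7 s of user time against roughly 2.3 s of wall clock for the four-core mcore run, consistent with four reactors polling in parallel); the figures for this run appear just below. To collect them all from a saved copy of this output (build.log is again only a placeholder path):

  grep -E '(real|user|sys)[[:space:]]+[0-9]+m[0-9.]+s' build.log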
00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:30.368 00:09:30.368 real 0m2.213s 00:09:30.368 user 0m7.171s 00:09:30.368 sys 0m0.580s 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:30.368 22:16:40 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:30.368 ************************************ 00:09:30.368 END TEST accel_cdev_decomp_full_mcore 00:09:30.368 ************************************ 00:09:30.368 22:16:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:30.368 22:16:40 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:30.368 22:16:40 accel -- 
common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:30.368 22:16:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:30.368 22:16:40 accel -- common/autotest_common.sh@10 -- # set +x 00:09:30.628 ************************************ 00:09:30.628 START TEST accel_cdev_decomp_mthread 00:09:30.628 ************************************ 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:30.628 22:16:40 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:30.628 [2024-07-12 22:16:40.737976] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
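(Editor's note) The last variant in view, accel_cdev_decomp_mthread, goes back to a single core (-c 0x1 in the EAL parameters below) but adds -T 2; the harness echoes val=2 for it further down, so accel_perf is asked to run two worker threads on the one reactor. That reading of -T follows from the echoed value, not from the tool's help text. The invocation, copied from the xtrace (SPDK_DIR as in the sketch further up):

  "$SPDK_DIR"/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l "$SPDK_DIR"/test/accel/bib -y -T 2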
00:09:30.628 [2024-07-12 22:16:40.738046] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3400751 ] 00:09:30.628 [2024-07-12 22:16:40.869229] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:30.887 [2024-07-12 22:16:40.970884] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:31.454 [2024-07-12 22:16:41.740710] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:31.454 [2024-07-12 22:16:41.743455] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15b5080 PMD being used: compress_qat 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:31.454 [2024-07-12 22:16:41.748402] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15ba2a0 PMD being used: compress_qat 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:31.454 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 [2024-07-12 22:16:41.750960] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16dd0f0 PMD being used: compress_qat 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:31.455 22:16:41 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 
22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:31.455 22:16:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:32.832 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 
00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:32.833 00:09:32.833 real 0m2.232s 00:09:32.833 user 0m0.021s 00:09:32.833 sys 0m0.003s 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:32.833 22:16:42 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:32.833 ************************************ 00:09:32.833 END TEST accel_cdev_decomp_mthread 00:09:32.833 ************************************ 00:09:32.833 22:16:42 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:32.833 22:16:42 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:32.833 22:16:42 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:32.833 22:16:42 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:32.833 22:16:42 accel -- common/autotest_common.sh@10 -- # set +x 00:09:32.833 ************************************ 00:09:32.833 START TEST accel_cdev_decomp_full_mthread 00:09:32.833 ************************************ 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:32.833 22:16:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:32.833 [2024-07-12 22:16:43.023950] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
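The full_mthread variant below is the same accel_perf run with one extra option, -o 0; in its trace the per-job data size changes from '4096 bytes' to '111250 bytes', i.e. the whole bib test file is handled in one shot rather than in 4 KiB chunks (that reading of -o 0 is inferred from these traces, not from accel_perf documentation). The two command lines, reusing the /tmp/accel.json sketch from above:
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# chunked run (accel_cdev_decomp_mthread)
$SPDK/build/examples/accel_perf -c /tmp/accel.json -t 1 -w decompress -l $SPDK/test/accel/bib -y -T 2
# full-buffer run (accel_cdev_decomp_full_mthread): note the added -o 0
$SPDK/build/examples/accel_perf -c /tmp/accel.json -t 1 -w decompress -l $SPDK/test/accel/bib -y -o 0 -T 2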
00:09:32.833 [2024-07-12 22:16:43.024009] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3401118 ] 00:09:32.833 [2024-07-12 22:16:43.150430] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:33.092 [2024-07-12 22:16:43.247791] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:34.028 [2024-07-12 22:16:44.010320] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:34.028 [2024-07-12 22:16:44.012839] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13fc080 PMD being used: compress_qat 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:34.028 [2024-07-12 22:16:44.017048] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13ff3b0 PMD being used: compress_qat 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:34.028 22:16:44 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 [2024-07-12 22:16:44.019692] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1523cc0 PMD being used: compress_qat 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.028 22:16:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@20 -- # val= 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:34.964 00:09:34.964 real 0m2.211s 00:09:34.964 user 0m0.012s 00:09:34.964 sys 0m0.001s 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:34.964 22:16:45 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:34.964 ************************************ 00:09:34.964 END TEST accel_cdev_decomp_full_mthread 00:09:34.964 ************************************ 00:09:34.964 22:16:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:34.964 22:16:45 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:09:34.964 22:16:45 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:09:34.964 22:16:45 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:09:34.964 22:16:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:34.964 22:16:45 accel -- common/autotest_common.sh@10 -- # set +x 00:09:34.964 22:16:45 accel -- accel/accel.sh@137 -- # build_accel_config 00:09:34.964 22:16:45 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:34.964 22:16:45 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:34.964 22:16:45 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:34.964 22:16:45 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:34.964 22:16:45 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:34.964 22:16:45 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:34.964 22:16:45 accel -- accel/accel.sh@41 -- # jq -r . 00:09:34.964 ************************************ 00:09:34.964 START TEST accel_dif_functional_tests 00:09:34.964 ************************************ 00:09:34.964 22:16:45 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:09:35.224 [2024-07-12 22:16:45.333086] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:09:35.224 [2024-07-12 22:16:45.333145] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3401326 ] 00:09:35.224 [2024-07-12 22:16:45.461818] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:35.483 [2024-07-12 22:16:45.566854] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:35.483 [2024-07-12 22:16:45.566945] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:35.483 [2024-07-12 22:16:45.566952] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:35.483 00:09:35.483 00:09:35.483 CUnit - A unit testing framework for C - Version 2.1-3 00:09:35.483 http://cunit.sourceforge.net/ 00:09:35.483 00:09:35.483 00:09:35.483 Suite: accel_dif 00:09:35.483 Test: verify: DIF generated, GUARD check ...passed 00:09:35.483 Test: verify: DIF generated, APPTAG check ...passed 00:09:35.483 Test: verify: DIF generated, REFTAG check ...passed 00:09:35.483 Test: verify: DIF not generated, GUARD check ...[2024-07-12 22:16:45.668531] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:09:35.483 passed 00:09:35.483 Test: verify: DIF not generated, APPTAG check ...[2024-07-12 22:16:45.668600] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:09:35.483 passed 00:09:35.483 Test: verify: DIF not generated, REFTAG check ...[2024-07-12 22:16:45.668637] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:09:35.483 passed 00:09:35.483 Test: verify: APPTAG correct, APPTAG check ...passed 00:09:35.483 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-12 22:16:45.668716] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:09:35.483 passed 00:09:35.483 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:09:35.483 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:09:35.483 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:09:35.483 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-12 22:16:45.668892] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:09:35.483 passed 00:09:35.483 Test: verify copy: DIF generated, GUARD check ...passed 00:09:35.483 Test: verify copy: DIF generated, APPTAG check ...passed 00:09:35.483 Test: verify copy: DIF generated, REFTAG check ...passed 00:09:35.483 Test: verify copy: DIF not generated, GUARD check ...[2024-07-12 22:16:45.669089] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:09:35.483 passed 00:09:35.483 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-12 22:16:45.669130] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:09:35.483 passed 00:09:35.483 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-12 22:16:45.669169] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:09:35.483 passed 00:09:35.483 Test: generate copy: DIF generated, GUARD check ...passed 00:09:35.483 Test: generate copy: DIF generated, APTTAG check ...passed 00:09:35.483 Test: generate copy: DIF generated, REFTAG check ...passed 00:09:35.483 Test: generate copy: DIF generated, no GUARD check flag set ...passed 
00:09:35.483 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:09:35.483 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:09:35.483 Test: generate copy: iovecs-len validate ...[2024-07-12 22:16:45.669422] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:09:35.483 passed 00:09:35.483 Test: generate copy: buffer alignment validate ...passed 00:09:35.483 00:09:35.483 Run Summary: Type Total Ran Passed Failed Inactive 00:09:35.483 suites 1 1 n/a 0 0 00:09:35.483 tests 26 26 26 0 0 00:09:35.483 asserts 115 115 115 0 n/a 00:09:35.483 00:09:35.483 Elapsed time = 0.003 seconds 00:09:35.743 00:09:35.743 real 0m0.595s 00:09:35.743 user 0m0.789s 00:09:35.743 sys 0m0.227s 00:09:35.743 22:16:45 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:35.743 22:16:45 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:09:35.743 ************************************ 00:09:35.743 END TEST accel_dif_functional_tests 00:09:35.743 ************************************ 00:09:35.743 22:16:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:35.743 00:09:35.743 real 0m53.574s 00:09:35.743 user 1m1.773s 00:09:35.743 sys 0m11.874s 00:09:35.743 22:16:45 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:35.743 22:16:45 accel -- common/autotest_common.sh@10 -- # set +x 00:09:35.743 ************************************ 00:09:35.743 END TEST accel 00:09:35.743 ************************************ 00:09:35.743 22:16:45 -- common/autotest_common.sh@1142 -- # return 0 00:09:35.743 22:16:45 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:09:35.743 22:16:45 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:35.743 22:16:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:35.743 22:16:45 -- common/autotest_common.sh@10 -- # set +x 00:09:35.743 ************************************ 00:09:35.743 START TEST accel_rpc 00:09:35.743 ************************************ 00:09:35.743 22:16:45 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:09:36.002 * Looking for test storage... 00:09:36.002 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:09:36.002 22:16:46 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:36.002 22:16:46 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=3401555 00:09:36.002 22:16:46 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:09:36.002 22:16:46 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 3401555 00:09:36.002 22:16:46 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 3401555 ']' 00:09:36.002 22:16:46 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:36.002 22:16:46 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:36.002 22:16:46 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:36.002 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
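The accel_rpc test below starts a bare spdk_tgt with --wait-for-rpc and exercises the opcode-assignment RPCs before the framework is initialized. A minimal sketch of the same flow with scripts/rpc.py; all RPC names are taken from the trace that follows, and the harness's waitforlisten helper is replaced here by a plain sleep:
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK/build/bin/spdk_tgt --wait-for-rpc &
sleep 2   # crude stand-in for waitforlisten
# pin the 'copy' opcode to the software module, then finish startup
$SPDK/scripts/rpc.py accel_assign_opc -o copy -m software
$SPDK/scripts/rpc.py framework_start_init
# confirm the assignment stuck
$SPDK/scripts/rpc.py accel_get_opc_assignments | jq -r .copy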
00:09:36.002 22:16:46 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:36.002 22:16:46 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:36.002 [2024-07-12 22:16:46.172007] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:09:36.002 [2024-07-12 22:16:46.172086] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3401555 ] 00:09:36.002 [2024-07-12 22:16:46.295998] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:36.262 [2024-07-12 22:16:46.399466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:36.829 22:16:47 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:36.829 22:16:47 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:09:36.829 22:16:47 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:09:36.829 22:16:47 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:09:36.829 22:16:47 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:09:36.829 22:16:47 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:09:36.829 22:16:47 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:09:36.829 22:16:47 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:36.829 22:16:47 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:36.829 22:16:47 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:36.829 ************************************ 00:09:36.829 START TEST accel_assign_opcode 00:09:36.829 ************************************ 00:09:36.829 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:09:36.829 22:16:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:09:36.829 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:36.829 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:36.829 [2024-07-12 22:16:47.145874] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:09:36.829 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:36.829 22:16:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:09:36.829 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:36.829 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:36.829 [2024-07-12 22:16:47.153874] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd 
accel_get_opc_assignments 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:37.088 software 00:09:37.088 00:09:37.088 real 0m0.268s 00:09:37.088 user 0m0.046s 00:09:37.088 sys 0m0.012s 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:37.088 22:16:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:37.088 ************************************ 00:09:37.088 END TEST accel_assign_opcode 00:09:37.088 ************************************ 00:09:37.348 22:16:47 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:09:37.348 22:16:47 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 3401555 00:09:37.348 22:16:47 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 3401555 ']' 00:09:37.348 22:16:47 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 3401555 00:09:37.348 22:16:47 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:09:37.348 22:16:47 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:37.348 22:16:47 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3401555 00:09:37.348 22:16:47 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:37.348 22:16:47 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:37.348 22:16:47 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3401555' 00:09:37.348 killing process with pid 3401555 00:09:37.348 22:16:47 accel_rpc -- common/autotest_common.sh@967 -- # kill 3401555 00:09:37.348 22:16:47 accel_rpc -- common/autotest_common.sh@972 -- # wait 3401555 00:09:37.607 00:09:37.607 real 0m1.901s 00:09:37.607 user 0m1.966s 00:09:37.607 sys 0m0.598s 00:09:37.607 22:16:47 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:37.607 22:16:47 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:37.607 ************************************ 00:09:37.607 END TEST accel_rpc 00:09:37.607 ************************************ 00:09:37.866 22:16:47 -- common/autotest_common.sh@1142 -- # return 0 00:09:37.866 22:16:47 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:09:37.866 22:16:47 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:37.866 22:16:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:37.866 22:16:47 -- common/autotest_common.sh@10 -- # set +x 00:09:37.866 ************************************ 00:09:37.866 START TEST app_cmdline 00:09:37.866 ************************************ 00:09:37.866 22:16:47 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:09:37.866 * Looking for test storage... 
00:09:37.866 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:09:37.866 22:16:48 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:09:37.866 22:16:48 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=3401814 00:09:37.866 22:16:48 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 3401814 00:09:37.866 22:16:48 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:09:37.866 22:16:48 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 3401814 ']' 00:09:37.866 22:16:48 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:37.866 22:16:48 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:37.866 22:16:48 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:37.866 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:37.866 22:16:48 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:37.866 22:16:48 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:09:37.866 [2024-07-12 22:16:48.161416] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:09:37.866 [2024-07-12 22:16:48.161496] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3401814 ] 00:09:38.125 [2024-07-12 22:16:48.292444] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:38.125 [2024-07-12 22:16:48.399661] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:09:39.062 22:16:49 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:09:39.062 { 00:09:39.062 "version": "SPDK v24.09-pre git sha1 9b8dc23b2", 00:09:39.062 "fields": { 00:09:39.062 "major": 24, 00:09:39.062 "minor": 9, 00:09:39.062 "patch": 0, 00:09:39.062 "suffix": "-pre", 00:09:39.062 "commit": "9b8dc23b2" 00:09:39.062 } 00:09:39.062 } 00:09:39.062 22:16:49 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:09:39.062 22:16:49 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:09:39.062 22:16:49 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:09:39.062 22:16:49 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:09:39.062 22:16:49 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:09:39.062 22:16:49 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:09:39.062 22:16:49 app_cmdline -- app/cmdline.sh@26 -- # sort 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:39.062 22:16:49 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:09:39.062 22:16:49 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ 
\s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:09:39.062 22:16:49 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:39.062 22:16:49 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:39.321 request: 00:09:39.321 { 00:09:39.321 "method": "env_dpdk_get_mem_stats", 00:09:39.321 "req_id": 1 00:09:39.321 } 00:09:39.321 Got JSON-RPC error response 00:09:39.321 response: 00:09:39.321 { 00:09:39.321 "code": -32601, 00:09:39.321 "message": "Method not found" 00:09:39.321 } 00:09:39.321 22:16:49 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:09:39.321 22:16:49 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:39.321 22:16:49 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:39.321 22:16:49 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:39.321 22:16:49 app_cmdline -- app/cmdline.sh@1 -- # killprocess 3401814 00:09:39.321 22:16:49 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 3401814 ']' 00:09:39.321 22:16:49 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 3401814 00:09:39.321 22:16:49 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:09:39.321 22:16:49 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:39.321 22:16:49 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3401814 00:09:39.580 22:16:49 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:39.580 22:16:49 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:39.580 22:16:49 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3401814' 00:09:39.580 killing process with pid 3401814 00:09:39.580 22:16:49 app_cmdline -- common/autotest_common.sh@967 -- # kill 3401814 00:09:39.580 22:16:49 app_cmdline -- common/autotest_common.sh@972 -- # wait 3401814 00:09:39.839 00:09:39.839 real 0m2.040s 00:09:39.839 user 0m2.470s 00:09:39.839 sys 0m0.617s 00:09:39.839 22:16:50 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:39.839 22:16:50 app_cmdline -- common/autotest_common.sh@10 -- # set +x 
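This cmdline test starts spdk_tgt with --rpcs-allowed spdk_get_version,rpc_get_methods, so only those two methods are reachable over the RPC socket; any other call, such as env_dpdk_get_mem_stats above, comes back with the JSON-RPC -32601 "Method not found" error. A short sketch reproducing both outcomes with scripts/rpc.py:
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
sleep 2
$SPDK/scripts/rpc.py spdk_get_version                       # allowed: prints the version JSON
$SPDK/scripts/rpc.py rpc_get_methods | jq -r '.[]' | sort   # allowed
$SPDK/scripts/rpc.py env_dpdk_get_mem_stats                 # rejected: error -32601, Method not found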
00:09:39.839 ************************************ 00:09:39.839 END TEST app_cmdline 00:09:39.839 ************************************ 00:09:39.839 22:16:50 -- common/autotest_common.sh@1142 -- # return 0 00:09:39.839 22:16:50 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:09:39.839 22:16:50 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:39.839 22:16:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:39.839 22:16:50 -- common/autotest_common.sh@10 -- # set +x 00:09:39.839 ************************************ 00:09:39.839 START TEST version 00:09:39.839 ************************************ 00:09:39.839 22:16:50 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:09:40.099 * Looking for test storage... 00:09:40.099 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:09:40.099 22:16:50 version -- app/version.sh@17 -- # get_header_version major 00:09:40.099 22:16:50 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:40.099 22:16:50 version -- app/version.sh@14 -- # cut -f2 00:09:40.099 22:16:50 version -- app/version.sh@14 -- # tr -d '"' 00:09:40.099 22:16:50 version -- app/version.sh@17 -- # major=24 00:09:40.099 22:16:50 version -- app/version.sh@18 -- # get_header_version minor 00:09:40.099 22:16:50 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:40.099 22:16:50 version -- app/version.sh@14 -- # cut -f2 00:09:40.099 22:16:50 version -- app/version.sh@14 -- # tr -d '"' 00:09:40.099 22:16:50 version -- app/version.sh@18 -- # minor=9 00:09:40.099 22:16:50 version -- app/version.sh@19 -- # get_header_version patch 00:09:40.099 22:16:50 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:40.099 22:16:50 version -- app/version.sh@14 -- # cut -f2 00:09:40.099 22:16:50 version -- app/version.sh@14 -- # tr -d '"' 00:09:40.099 22:16:50 version -- app/version.sh@19 -- # patch=0 00:09:40.099 22:16:50 version -- app/version.sh@20 -- # get_header_version suffix 00:09:40.099 22:16:50 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:40.099 22:16:50 version -- app/version.sh@14 -- # cut -f2 00:09:40.099 22:16:50 version -- app/version.sh@14 -- # tr -d '"' 00:09:40.099 22:16:50 version -- app/version.sh@20 -- # suffix=-pre 00:09:40.099 22:16:50 version -- app/version.sh@22 -- # version=24.9 00:09:40.099 22:16:50 version -- app/version.sh@25 -- # (( patch != 0 )) 00:09:40.099 22:16:50 version -- app/version.sh@28 -- # version=24.9rc0 00:09:40.099 22:16:50 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:09:40.099 22:16:50 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:09:40.099 22:16:50 version -- app/version.sh@30 -- # py_version=24.9rc0 00:09:40.099 
22:16:50 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:09:40.099 00:09:40.099 real 0m0.192s 00:09:40.099 user 0m0.109s 00:09:40.099 sys 0m0.131s 00:09:40.099 22:16:50 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:40.099 22:16:50 version -- common/autotest_common.sh@10 -- # set +x 00:09:40.099 ************************************ 00:09:40.099 END TEST version 00:09:40.099 ************************************ 00:09:40.099 22:16:50 -- common/autotest_common.sh@1142 -- # return 0 00:09:40.099 22:16:50 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:09:40.099 22:16:50 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:09:40.099 22:16:50 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:40.099 22:16:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:40.099 22:16:50 -- common/autotest_common.sh@10 -- # set +x 00:09:40.099 ************************************ 00:09:40.099 START TEST blockdev_general 00:09:40.099 ************************************ 00:09:40.099 22:16:50 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:09:40.359 * Looking for test storage... 00:09:40.359 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:40.359 22:16:50 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@691 -- # 
wait_for_rpc=--wait-for-rpc 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=3402283 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:09:40.359 22:16:50 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 3402283 00:09:40.360 22:16:50 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 3402283 ']' 00:09:40.360 22:16:50 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:40.360 22:16:50 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:40.360 22:16:50 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:40.360 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:40.360 22:16:50 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:40.360 22:16:50 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:40.360 [2024-07-12 22:16:50.562482] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:09:40.360 [2024-07-12 22:16:50.562553] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3402283 ] 00:09:40.618 [2024-07-12 22:16:50.689834] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:40.618 [2024-07-12 22:16:50.798838] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:41.187 22:16:51 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:41.187 22:16:51 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:09:41.187 22:16:51 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:09:41.187 22:16:51 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:09:41.187 22:16:51 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:09:41.187 22:16:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:41.187 22:16:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:41.447 [2024-07-12 22:16:51.666014] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:41.447 [2024-07-12 22:16:51.666071] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:41.447 00:09:41.447 [2024-07-12 22:16:51.674006] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:41.447 [2024-07-12 22:16:51.674042] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:41.447 00:09:41.447 Malloc0 00:09:41.447 Malloc1 00:09:41.447 Malloc2 00:09:41.447 Malloc3 00:09:41.447 Malloc4 00:09:41.447 Malloc5 00:09:41.706 Malloc6 00:09:41.706 Malloc7 00:09:41.706 Malloc8 00:09:41.706 Malloc9 00:09:41.706 [2024-07-12 22:16:51.822902] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:41.706 [2024-07-12 22:16:51.822955] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:41.706 [2024-07-12 
22:16:51.822983] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc29350 00:09:41.706 [2024-07-12 22:16:51.822999] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:41.706 [2024-07-12 22:16:51.824412] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:41.706 [2024-07-12 22:16:51.824443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:41.706 TestPT 00:09:41.706 22:16:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:41.706 22:16:51 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:09:41.706 5000+0 records in 00:09:41.706 5000+0 records out 00:09:41.706 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0253471 s, 404 MB/s 00:09:41.706 22:16:51 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:09:41.706 22:16:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:41.706 22:16:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:41.706 AIO0 00:09:41.706 22:16:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:41.706 22:16:51 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:09:41.706 22:16:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:41.706 22:16:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:41.706 22:16:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:41.706 22:16:51 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:09:41.706 22:16:51 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:09:41.706 22:16:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:41.706 22:16:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:41.706 22:16:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:41.706 22:16:51 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:09:41.706 22:16:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:41.706 22:16:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:41.706 22:16:52 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:41.706 22:16:52 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:09:41.706 22:16:52 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:41.706 22:16:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:41.706 22:16:52 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:41.706 22:16:52 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:09:41.706 22:16:52 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:09:41.706 22:16:52 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:09:41.706 22:16:52 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:41.706 22:16:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:41.966 22:16:52 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:41.966 22:16:52 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:09:41.966 22:16:52 
blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:09:41.967 22:16:52 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "ae196e9f-543d-41a6-bffa-6a68ff3a6532"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ae196e9f-543d-41a6-bffa-6a68ff3a6532",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "e36572d6-d4e9-5019-99fd-98556fa1c2b7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e36572d6-d4e9-5019-99fd-98556fa1c2b7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "5b9ee6f0-fd32-5a78-adb2-6a1fba2116b6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "5b9ee6f0-fd32-5a78-adb2-6a1fba2116b6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "e4d42630-ec49-5c20-8b43-d259fe7da222"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e4d42630-ec49-5c20-8b43-d259fe7da222",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "202e34ce-32e9-5aee-8b80-bbdcbcf72816"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "202e34ce-32e9-5aee-8b80-bbdcbcf72816",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "e22f1047-8289-57e0-880d-fb66f2ebbf2f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e22f1047-8289-57e0-880d-fb66f2ebbf2f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "2f6cb6b0-6109-5343-a3cf-1abc380217f1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "2f6cb6b0-6109-5343-a3cf-1abc380217f1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' 
"cb275119-56e3-5141-b0a1-7c6fc29be7d1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "cb275119-56e3-5141-b0a1-7c6fc29be7d1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "1c5ff529-2d37-5e37-8979-4472172373d6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1c5ff529-2d37-5e37-8979-4472172373d6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "e04a7d64-17ea-5cb3-969f-4768e1852c9d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e04a7d64-17ea-5cb3-969f-4768e1852c9d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "82207083-d649-5185-bc0a-8944c83f132a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "82207083-d649-5185-bc0a-8944c83f132a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": 
false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "b131dffa-6b7f-5914-9b5e-81fb928f2687"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b131dffa-6b7f-5914-9b5e-81fb928f2687",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "49b1f3aa-a753-4e29-ba43-8b9c6d056607"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "49b1f3aa-a753-4e29-ba43-8b9c6d056607",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "49b1f3aa-a753-4e29-ba43-8b9c6d056607",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "1b56e5d5-60c9-4d89-9181-84e05ad68d59",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "5ec7940d-9f51-4001-b493-f7ab5dffcfed",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "fa972a55-2c46-4da1-b5aa-afd16cfc527d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "fa972a55-2c46-4da1-b5aa-afd16cfc527d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "fa972a55-2c46-4da1-b5aa-afd16cfc527d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "c1fcf652-f04c-4416-9ab6-e0cf63c3ba6c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "9d438139-ec64-495a-8784-b5f5738b985c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "8ab30bb2-c4eb-4b4b-b6c1-ab96eaa674a3"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8ab30bb2-c4eb-4b4b-b6c1-ab96eaa674a3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "8ab30bb2-c4eb-4b4b-b6c1-ab96eaa674a3",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "9fa6c3e3-e845-4e31-b9b0-7c6614842f68",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "85260974-9020-4e32-8fa2-e17299f382a1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "71ed0ede-a550-4bb7-a37c-e69dce76ae18"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": 
"71ed0ede-a550-4bb7-a37c-e69dce76ae18",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:41.967 22:16:52 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:09:41.967 22:16:52 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:09:41.967 22:16:52 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:09:41.967 22:16:52 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 3402283 00:09:41.967 22:16:52 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 3402283 ']' 00:09:41.967 22:16:52 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 3402283 00:09:41.967 22:16:52 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:09:42.226 22:16:52 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:42.226 22:16:52 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3402283 00:09:42.226 22:16:52 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:42.226 22:16:52 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:42.226 22:16:52 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3402283' 00:09:42.226 killing process with pid 3402283 00:09:42.226 22:16:52 blockdev_general -- common/autotest_common.sh@967 -- # kill 3402283 00:09:42.226 22:16:52 blockdev_general -- common/autotest_common.sh@972 -- # wait 3402283 00:09:42.485 22:16:52 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:42.485 22:16:52 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:42.485 22:16:52 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:42.485 22:16:52 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:42.485 22:16:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:42.743 ************************************ 00:09:42.743 START TEST bdev_hello_world 00:09:42.743 ************************************ 00:09:42.743 22:16:52 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:42.743 [2024-07-12 22:16:52.882240] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:09:42.743 [2024-07-12 22:16:52.882303] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3402653 ] 00:09:42.743 [2024-07-12 22:16:53.008241] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:43.001 [2024-07-12 22:16:53.109306] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:43.001 [2024-07-12 22:16:53.268175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:43.001 [2024-07-12 22:16:53.268237] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:43.001 [2024-07-12 22:16:53.268256] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:43.002 [2024-07-12 22:16:53.276180] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:43.002 [2024-07-12 22:16:53.276213] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:43.002 [2024-07-12 22:16:53.284191] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:43.002 [2024-07-12 22:16:53.284219] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:43.261 [2024-07-12 22:16:53.361295] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:43.261 [2024-07-12 22:16:53.361351] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:43.261 [2024-07-12 22:16:53.361375] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf1a3c0 00:09:43.261 [2024-07-12 22:16:53.361395] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:43.261 [2024-07-12 22:16:53.362884] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:43.261 [2024-07-12 22:16:53.362915] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:43.261 [2024-07-12 22:16:53.516370] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:09:43.261 [2024-07-12 22:16:53.516453] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:09:43.261 [2024-07-12 22:16:53.516526] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:09:43.261 [2024-07-12 22:16:53.516623] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:09:43.261 [2024-07-12 22:16:53.516719] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:09:43.261 [2024-07-12 22:16:53.516761] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:09:43.261 [2024-07-12 22:16:53.516845] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:09:43.261 00:09:43.261 [2024-07-12 22:16:53.516901] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:09:43.829 00:09:43.829 real 0m1.021s 00:09:43.829 user 0m0.660s 00:09:43.829 sys 0m0.318s 00:09:43.829 22:16:53 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:43.829 22:16:53 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:09:43.829 ************************************ 00:09:43.829 END TEST bdev_hello_world 00:09:43.829 ************************************ 00:09:43.829 22:16:53 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:43.829 22:16:53 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:09:43.829 22:16:53 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:43.829 22:16:53 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:43.829 22:16:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:43.829 ************************************ 00:09:43.829 START TEST bdev_bounds 00:09:43.829 ************************************ 00:09:43.829 22:16:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:09:43.829 22:16:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=3402844 00:09:43.829 22:16:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:09:43.829 22:16:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:43.829 22:16:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 3402844' 00:09:43.829 Process bdevio pid: 3402844 00:09:43.829 22:16:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 3402844 00:09:43.829 22:16:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 3402844 ']' 00:09:43.829 22:16:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:43.829 22:16:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:43.829 22:16:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:43.829 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:43.829 22:16:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:43.829 22:16:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:43.829 [2024-07-12 22:16:54.016647] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
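The bdev_bounds stage starting here drives the bdevio example app against the same JSON config: bdevio is launched with -w (wait until tests are requested over RPC) and -s 0, and test/bdev/bdevio/tests.py then issues perform_tests over the RPC socket, which produces the CUnit suite output that follows. A hand-run approximation is sketched below; the RPC socket is left at its default path, which is an assumption rather than something the log states.

    # Sketch: run the same bounds/IO checks outside the autotest harness.
    # -w makes bdevio wait until perform_tests is requested over RPC.
    ./test/bdev/bdevio/bdevio -w -s 0 --json ./test/bdev/bdev.json &
    ./test/bdev/bdevio/tests.py perform_tests
    wait    # bdevio exits once the CUnit suites finish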
00:09:43.829 [2024-07-12 22:16:54.016786] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3402844 ] 00:09:44.087 [2024-07-12 22:16:54.213701] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:44.087 [2024-07-12 22:16:54.318684] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:44.087 [2024-07-12 22:16:54.318770] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:44.087 [2024-07-12 22:16:54.318774] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:44.345 [2024-07-12 22:16:54.480361] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:44.345 [2024-07-12 22:16:54.480426] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:44.345 [2024-07-12 22:16:54.480447] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:44.345 [2024-07-12 22:16:54.488374] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:44.345 [2024-07-12 22:16:54.488407] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:44.345 [2024-07-12 22:16:54.496384] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:44.345 [2024-07-12 22:16:54.496413] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:44.345 [2024-07-12 22:16:54.573609] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:44.345 [2024-07-12 22:16:54.573662] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:44.345 [2024-07-12 22:16:54.573687] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17030c0 00:09:44.345 [2024-07-12 22:16:54.573705] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:44.345 [2024-07-12 22:16:54.575209] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:44.345 [2024-07-12 22:16:54.575241] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:44.603 22:16:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:44.603 22:16:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:09:44.603 22:16:54 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:09:44.863 I/O targets: 00:09:44.863 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:09:44.863 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:09:44.863 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:09:44.863 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:09:44.863 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:09:44.863 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:09:44.863 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:09:44.863 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:09:44.863 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:09:44.863 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:09:44.863 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:09:44.863 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:09:44.863 raid0: 131072 blocks of 512 bytes (64 MiB) 00:09:44.863 concat0: 131072 blocks of 512 bytes (64 MiB) 
00:09:44.863 raid1: 65536 blocks of 512 bytes (32 MiB) 00:09:44.863 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:09:44.863 00:09:44.863 00:09:44.863 CUnit - A unit testing framework for C - Version 2.1-3 00:09:44.863 http://cunit.sourceforge.net/ 00:09:44.863 00:09:44.863 00:09:44.863 Suite: bdevio tests on: AIO0 00:09:44.863 Test: blockdev write read block ...passed 00:09:44.863 Test: blockdev write zeroes read block ...passed 00:09:44.863 Test: blockdev write zeroes read no split ...passed 00:09:44.863 Test: blockdev write zeroes read split ...passed 00:09:44.863 Test: blockdev write zeroes read split partial ...passed 00:09:44.863 Test: blockdev reset ...passed 00:09:44.863 Test: blockdev write read 8 blocks ...passed 00:09:44.863 Test: blockdev write read size > 128k ...passed 00:09:44.863 Test: blockdev write read invalid size ...passed 00:09:44.863 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:44.863 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:44.863 Test: blockdev write read max offset ...passed 00:09:44.863 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:44.863 Test: blockdev writev readv 8 blocks ...passed 00:09:44.863 Test: blockdev writev readv 30 x 1block ...passed 00:09:44.863 Test: blockdev writev readv block ...passed 00:09:44.863 Test: blockdev writev readv size > 128k ...passed 00:09:44.863 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:44.863 Test: blockdev comparev and writev ...passed 00:09:44.863 Test: blockdev nvme passthru rw ...passed 00:09:44.863 Test: blockdev nvme passthru vendor specific ...passed 00:09:44.863 Test: blockdev nvme admin passthru ...passed 00:09:44.863 Test: blockdev copy ...passed 00:09:44.863 Suite: bdevio tests on: raid1 00:09:44.863 Test: blockdev write read block ...passed 00:09:44.863 Test: blockdev write zeroes read block ...passed 00:09:44.863 Test: blockdev write zeroes read no split ...passed 00:09:44.863 Test: blockdev write zeroes read split ...passed 00:09:44.863 Test: blockdev write zeroes read split partial ...passed 00:09:44.863 Test: blockdev reset ...passed 00:09:44.863 Test: blockdev write read 8 blocks ...passed 00:09:44.863 Test: blockdev write read size > 128k ...passed 00:09:44.863 Test: blockdev write read invalid size ...passed 00:09:44.863 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:44.863 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:44.863 Test: blockdev write read max offset ...passed 00:09:44.863 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:44.863 Test: blockdev writev readv 8 blocks ...passed 00:09:44.863 Test: blockdev writev readv 30 x 1block ...passed 00:09:44.863 Test: blockdev writev readv block ...passed 00:09:44.863 Test: blockdev writev readv size > 128k ...passed 00:09:44.863 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:44.863 Test: blockdev comparev and writev ...passed 00:09:44.863 Test: blockdev nvme passthru rw ...passed 00:09:44.863 Test: blockdev nvme passthru vendor specific ...passed 00:09:44.863 Test: blockdev nvme admin passthru ...passed 00:09:44.863 Test: blockdev copy ...passed 00:09:44.863 Suite: bdevio tests on: concat0 00:09:44.863 Test: blockdev write read block ...passed 00:09:44.863 Test: blockdev write zeroes read block ...passed 00:09:44.863 Test: blockdev write zeroes read no split ...passed 00:09:44.863 Test: blockdev write zeroes read split 
...passed 00:09:44.863 Test: blockdev write zeroes read split partial ...passed 00:09:44.863 Test: blockdev reset ...passed 00:09:44.863 Test: blockdev write read 8 blocks ...passed 00:09:44.863 Test: blockdev write read size > 128k ...passed 00:09:44.863 Test: blockdev write read invalid size ...passed 00:09:44.863 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:44.863 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:44.863 Test: blockdev write read max offset ...passed 00:09:44.863 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:44.863 Test: blockdev writev readv 8 blocks ...passed 00:09:44.863 Test: blockdev writev readv 30 x 1block ...passed 00:09:44.863 Test: blockdev writev readv block ...passed 00:09:44.863 Test: blockdev writev readv size > 128k ...passed 00:09:44.863 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:44.863 Test: blockdev comparev and writev ...passed 00:09:44.863 Test: blockdev nvme passthru rw ...passed 00:09:44.863 Test: blockdev nvme passthru vendor specific ...passed 00:09:44.863 Test: blockdev nvme admin passthru ...passed 00:09:44.863 Test: blockdev copy ...passed 00:09:44.863 Suite: bdevio tests on: raid0 00:09:44.863 Test: blockdev write read block ...passed 00:09:44.863 Test: blockdev write zeroes read block ...passed 00:09:44.863 Test: blockdev write zeroes read no split ...passed 00:09:44.863 Test: blockdev write zeroes read split ...passed 00:09:44.863 Test: blockdev write zeroes read split partial ...passed 00:09:44.863 Test: blockdev reset ...passed 00:09:44.863 Test: blockdev write read 8 blocks ...passed 00:09:44.863 Test: blockdev write read size > 128k ...passed 00:09:44.863 Test: blockdev write read invalid size ...passed 00:09:44.863 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:44.863 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:44.863 Test: blockdev write read max offset ...passed 00:09:44.863 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:44.863 Test: blockdev writev readv 8 blocks ...passed 00:09:44.863 Test: blockdev writev readv 30 x 1block ...passed 00:09:44.863 Test: blockdev writev readv block ...passed 00:09:44.863 Test: blockdev writev readv size > 128k ...passed 00:09:44.863 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:44.863 Test: blockdev comparev and writev ...passed 00:09:44.863 Test: blockdev nvme passthru rw ...passed 00:09:44.863 Test: blockdev nvme passthru vendor specific ...passed 00:09:44.863 Test: blockdev nvme admin passthru ...passed 00:09:44.863 Test: blockdev copy ...passed 00:09:44.863 Suite: bdevio tests on: TestPT 00:09:44.863 Test: blockdev write read block ...passed 00:09:44.863 Test: blockdev write zeroes read block ...passed 00:09:44.863 Test: blockdev write zeroes read no split ...passed 00:09:44.863 Test: blockdev write zeroes read split ...passed 00:09:44.863 Test: blockdev write zeroes read split partial ...passed 00:09:44.863 Test: blockdev reset ...passed 00:09:44.863 Test: blockdev write read 8 blocks ...passed 00:09:44.863 Test: blockdev write read size > 128k ...passed 00:09:44.863 Test: blockdev write read invalid size ...passed 00:09:44.863 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:44.863 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:44.863 Test: blockdev write read max offset ...passed 00:09:44.863 Test: 
blockdev write read 2 blocks on overlapped address offset ...passed 00:09:44.863 Test: blockdev writev readv 8 blocks ...passed 00:09:44.863 Test: blockdev writev readv 30 x 1block ...passed 00:09:44.863 Test: blockdev writev readv block ...passed 00:09:44.863 Test: blockdev writev readv size > 128k ...passed 00:09:44.863 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:44.863 Test: blockdev comparev and writev ...passed 00:09:44.863 Test: blockdev nvme passthru rw ...passed 00:09:44.863 Test: blockdev nvme passthru vendor specific ...passed 00:09:44.863 Test: blockdev nvme admin passthru ...passed 00:09:44.863 Test: blockdev copy ...passed 00:09:44.863 Suite: bdevio tests on: Malloc2p7 00:09:44.863 Test: blockdev write read block ...passed 00:09:44.863 Test: blockdev write zeroes read block ...passed 00:09:44.863 Test: blockdev write zeroes read no split ...passed 00:09:44.863 Test: blockdev write zeroes read split ...passed 00:09:44.863 Test: blockdev write zeroes read split partial ...passed 00:09:44.863 Test: blockdev reset ...passed 00:09:44.863 Test: blockdev write read 8 blocks ...passed 00:09:44.863 Test: blockdev write read size > 128k ...passed 00:09:44.863 Test: blockdev write read invalid size ...passed 00:09:44.863 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:44.863 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:44.863 Test: blockdev write read max offset ...passed 00:09:44.863 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:44.863 Test: blockdev writev readv 8 blocks ...passed 00:09:44.863 Test: blockdev writev readv 30 x 1block ...passed 00:09:44.863 Test: blockdev writev readv block ...passed 00:09:44.863 Test: blockdev writev readv size > 128k ...passed 00:09:44.863 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:44.863 Test: blockdev comparev and writev ...passed 00:09:44.863 Test: blockdev nvme passthru rw ...passed 00:09:44.863 Test: blockdev nvme passthru vendor specific ...passed 00:09:44.863 Test: blockdev nvme admin passthru ...passed 00:09:44.863 Test: blockdev copy ...passed 00:09:44.863 Suite: bdevio tests on: Malloc2p6 00:09:44.863 Test: blockdev write read block ...passed 00:09:44.863 Test: blockdev write zeroes read block ...passed 00:09:44.863 Test: blockdev write zeroes read no split ...passed 00:09:44.863 Test: blockdev write zeroes read split ...passed 00:09:44.863 Test: blockdev write zeroes read split partial ...passed 00:09:44.863 Test: blockdev reset ...passed 00:09:44.863 Test: blockdev write read 8 blocks ...passed 00:09:44.863 Test: blockdev write read size > 128k ...passed 00:09:44.863 Test: blockdev write read invalid size ...passed 00:09:44.863 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:44.863 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:44.863 Test: blockdev write read max offset ...passed 00:09:44.863 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:44.863 Test: blockdev writev readv 8 blocks ...passed 00:09:44.863 Test: blockdev writev readv 30 x 1block ...passed 00:09:44.863 Test: blockdev writev readv block ...passed 00:09:44.863 Test: blockdev writev readv size > 128k ...passed 00:09:44.863 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:44.864 Test: blockdev comparev and writev ...passed 00:09:44.864 Test: blockdev nvme passthru rw ...passed 00:09:44.864 Test: blockdev nvme passthru vendor 
specific ...passed 00:09:44.864 Test: blockdev nvme admin passthru ...passed 00:09:44.864 Test: blockdev copy ...passed 00:09:44.864 Suite: bdevio tests on: Malloc2p5 00:09:44.864 Test: blockdev write read block ...passed 00:09:44.864 Test: blockdev write zeroes read block ...passed 00:09:44.864 Test: blockdev write zeroes read no split ...passed 00:09:44.864 Test: blockdev write zeroes read split ...passed 00:09:44.864 Test: blockdev write zeroes read split partial ...passed 00:09:44.864 Test: blockdev reset ...passed 00:09:44.864 Test: blockdev write read 8 blocks ...passed 00:09:44.864 Test: blockdev write read size > 128k ...passed 00:09:44.864 Test: blockdev write read invalid size ...passed 00:09:44.864 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:44.864 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:44.864 Test: blockdev write read max offset ...passed 00:09:44.864 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:44.864 Test: blockdev writev readv 8 blocks ...passed 00:09:44.864 Test: blockdev writev readv 30 x 1block ...passed 00:09:44.864 Test: blockdev writev readv block ...passed 00:09:44.864 Test: blockdev writev readv size > 128k ...passed 00:09:44.864 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:44.864 Test: blockdev comparev and writev ...passed 00:09:44.864 Test: blockdev nvme passthru rw ...passed 00:09:44.864 Test: blockdev nvme passthru vendor specific ...passed 00:09:44.864 Test: blockdev nvme admin passthru ...passed 00:09:44.864 Test: blockdev copy ...passed 00:09:44.864 Suite: bdevio tests on: Malloc2p4 00:09:44.864 Test: blockdev write read block ...passed 00:09:44.864 Test: blockdev write zeroes read block ...passed 00:09:44.864 Test: blockdev write zeroes read no split ...passed 00:09:44.864 Test: blockdev write zeroes read split ...passed 00:09:44.864 Test: blockdev write zeroes read split partial ...passed 00:09:44.864 Test: blockdev reset ...passed 00:09:44.864 Test: blockdev write read 8 blocks ...passed 00:09:44.864 Test: blockdev write read size > 128k ...passed 00:09:44.864 Test: blockdev write read invalid size ...passed 00:09:44.864 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:44.864 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:44.864 Test: blockdev write read max offset ...passed 00:09:44.864 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:44.864 Test: blockdev writev readv 8 blocks ...passed 00:09:44.864 Test: blockdev writev readv 30 x 1block ...passed 00:09:44.864 Test: blockdev writev readv block ...passed 00:09:44.864 Test: blockdev writev readv size > 128k ...passed 00:09:44.864 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:44.864 Test: blockdev comparev and writev ...passed 00:09:44.864 Test: blockdev nvme passthru rw ...passed 00:09:44.864 Test: blockdev nvme passthru vendor specific ...passed 00:09:44.864 Test: blockdev nvme admin passthru ...passed 00:09:44.864 Test: blockdev copy ...passed 00:09:44.864 Suite: bdevio tests on: Malloc2p3 00:09:44.864 Test: blockdev write read block ...passed 00:09:44.864 Test: blockdev write zeroes read block ...passed 00:09:44.864 Test: blockdev write zeroes read no split ...passed 00:09:44.864 Test: blockdev write zeroes read split ...passed 00:09:44.864 Test: blockdev write zeroes read split partial ...passed 00:09:44.864 Test: blockdev reset ...passed 00:09:44.864 Test: 
blockdev write read 8 blocks ...passed 00:09:44.864 Test: blockdev write read size > 128k ...passed 00:09:44.864 Test: blockdev write read invalid size ...passed 00:09:44.864 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:44.864 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:44.864 Test: blockdev write read max offset ...passed 00:09:44.864 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:44.864 Test: blockdev writev readv 8 blocks ...passed 00:09:44.864 Test: blockdev writev readv 30 x 1block ...passed 00:09:44.864 Test: blockdev writev readv block ...passed 00:09:44.864 Test: blockdev writev readv size > 128k ...passed 00:09:44.864 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:44.864 Test: blockdev comparev and writev ...passed 00:09:44.864 Test: blockdev nvme passthru rw ...passed 00:09:44.864 Test: blockdev nvme passthru vendor specific ...passed 00:09:44.864 Test: blockdev nvme admin passthru ...passed 00:09:44.864 Test: blockdev copy ...passed 00:09:44.864 Suite: bdevio tests on: Malloc2p2 00:09:44.864 Test: blockdev write read block ...passed 00:09:44.864 Test: blockdev write zeroes read block ...passed 00:09:44.864 Test: blockdev write zeroes read no split ...passed 00:09:44.864 Test: blockdev write zeroes read split ...passed 00:09:44.864 Test: blockdev write zeroes read split partial ...passed 00:09:44.864 Test: blockdev reset ...passed 00:09:44.864 Test: blockdev write read 8 blocks ...passed 00:09:44.864 Test: blockdev write read size > 128k ...passed 00:09:44.864 Test: blockdev write read invalid size ...passed 00:09:44.864 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:44.864 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:44.864 Test: blockdev write read max offset ...passed 00:09:44.864 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:44.864 Test: blockdev writev readv 8 blocks ...passed 00:09:44.864 Test: blockdev writev readv 30 x 1block ...passed 00:09:44.864 Test: blockdev writev readv block ...passed 00:09:44.864 Test: blockdev writev readv size > 128k ...passed 00:09:44.864 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:44.864 Test: blockdev comparev and writev ...passed 00:09:44.864 Test: blockdev nvme passthru rw ...passed 00:09:44.864 Test: blockdev nvme passthru vendor specific ...passed 00:09:44.864 Test: blockdev nvme admin passthru ...passed 00:09:44.864 Test: blockdev copy ...passed 00:09:44.864 Suite: bdevio tests on: Malloc2p1 00:09:44.864 Test: blockdev write read block ...passed 00:09:44.864 Test: blockdev write zeroes read block ...passed 00:09:44.864 Test: blockdev write zeroes read no split ...passed 00:09:45.124 Test: blockdev write zeroes read split ...passed 00:09:45.124 Test: blockdev write zeroes read split partial ...passed 00:09:45.124 Test: blockdev reset ...passed 00:09:45.124 Test: blockdev write read 8 blocks ...passed 00:09:45.124 Test: blockdev write read size > 128k ...passed 00:09:45.124 Test: blockdev write read invalid size ...passed 00:09:45.124 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:45.124 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:45.124 Test: blockdev write read max offset ...passed 00:09:45.124 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:45.124 Test: blockdev writev readv 8 blocks ...passed 00:09:45.124 
Test: blockdev writev readv 30 x 1block ...passed 00:09:45.124 Test: blockdev writev readv block ...passed 00:09:45.124 Test: blockdev writev readv size > 128k ...passed 00:09:45.124 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:45.124 Test: blockdev comparev and writev ...passed 00:09:45.124 Test: blockdev nvme passthru rw ...passed 00:09:45.124 Test: blockdev nvme passthru vendor specific ...passed 00:09:45.124 Test: blockdev nvme admin passthru ...passed 00:09:45.124 Test: blockdev copy ...passed 00:09:45.124 Suite: bdevio tests on: Malloc2p0 00:09:45.124 Test: blockdev write read block ...passed 00:09:45.124 Test: blockdev write zeroes read block ...passed 00:09:45.124 Test: blockdev write zeroes read no split ...passed 00:09:45.124 Test: blockdev write zeroes read split ...passed 00:09:45.124 Test: blockdev write zeroes read split partial ...passed 00:09:45.124 Test: blockdev reset ...passed 00:09:45.124 Test: blockdev write read 8 blocks ...passed 00:09:45.124 Test: blockdev write read size > 128k ...passed 00:09:45.124 Test: blockdev write read invalid size ...passed 00:09:45.124 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:45.124 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:45.124 Test: blockdev write read max offset ...passed 00:09:45.124 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:45.124 Test: blockdev writev readv 8 blocks ...passed 00:09:45.124 Test: blockdev writev readv 30 x 1block ...passed 00:09:45.124 Test: blockdev writev readv block ...passed 00:09:45.124 Test: blockdev writev readv size > 128k ...passed 00:09:45.124 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:45.124 Test: blockdev comparev and writev ...passed 00:09:45.124 Test: blockdev nvme passthru rw ...passed 00:09:45.124 Test: blockdev nvme passthru vendor specific ...passed 00:09:45.124 Test: blockdev nvme admin passthru ...passed 00:09:45.124 Test: blockdev copy ...passed 00:09:45.124 Suite: bdevio tests on: Malloc1p1 00:09:45.124 Test: blockdev write read block ...passed 00:09:45.124 Test: blockdev write zeroes read block ...passed 00:09:45.124 Test: blockdev write zeroes read no split ...passed 00:09:45.124 Test: blockdev write zeroes read split ...passed 00:09:45.124 Test: blockdev write zeroes read split partial ...passed 00:09:45.124 Test: blockdev reset ...passed 00:09:45.124 Test: blockdev write read 8 blocks ...passed 00:09:45.124 Test: blockdev write read size > 128k ...passed 00:09:45.124 Test: blockdev write read invalid size ...passed 00:09:45.124 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:45.124 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:45.124 Test: blockdev write read max offset ...passed 00:09:45.124 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:45.124 Test: blockdev writev readv 8 blocks ...passed 00:09:45.124 Test: blockdev writev readv 30 x 1block ...passed 00:09:45.124 Test: blockdev writev readv block ...passed 00:09:45.124 Test: blockdev writev readv size > 128k ...passed 00:09:45.124 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:45.124 Test: blockdev comparev and writev ...passed 00:09:45.124 Test: blockdev nvme passthru rw ...passed 00:09:45.124 Test: blockdev nvme passthru vendor specific ...passed 00:09:45.124 Test: blockdev nvme admin passthru ...passed 00:09:45.124 Test: blockdev copy ...passed 00:09:45.124 Suite: 
bdevio tests on: Malloc1p0 00:09:45.124 Test: blockdev write read block ...passed 00:09:45.124 Test: blockdev write zeroes read block ...passed 00:09:45.124 Test: blockdev write zeroes read no split ...passed 00:09:45.124 Test: blockdev write zeroes read split ...passed 00:09:45.124 Test: blockdev write zeroes read split partial ...passed 00:09:45.124 Test: blockdev reset ...passed 00:09:45.125 Test: blockdev write read 8 blocks ...passed 00:09:45.125 Test: blockdev write read size > 128k ...passed 00:09:45.125 Test: blockdev write read invalid size ...passed 00:09:45.125 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:45.125 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:45.125 Test: blockdev write read max offset ...passed 00:09:45.125 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:45.125 Test: blockdev writev readv 8 blocks ...passed 00:09:45.125 Test: blockdev writev readv 30 x 1block ...passed 00:09:45.125 Test: blockdev writev readv block ...passed 00:09:45.125 Test: blockdev writev readv size > 128k ...passed 00:09:45.125 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:45.125 Test: blockdev comparev and writev ...passed 00:09:45.125 Test: blockdev nvme passthru rw ...passed 00:09:45.125 Test: blockdev nvme passthru vendor specific ...passed 00:09:45.125 Test: blockdev nvme admin passthru ...passed 00:09:45.125 Test: blockdev copy ...passed 00:09:45.125 Suite: bdevio tests on: Malloc0 00:09:45.125 Test: blockdev write read block ...passed 00:09:45.125 Test: blockdev write zeroes read block ...passed 00:09:45.125 Test: blockdev write zeroes read no split ...passed 00:09:45.125 Test: blockdev write zeroes read split ...passed 00:09:45.125 Test: blockdev write zeroes read split partial ...passed 00:09:45.125 Test: blockdev reset ...passed 00:09:45.125 Test: blockdev write read 8 blocks ...passed 00:09:45.125 Test: blockdev write read size > 128k ...passed 00:09:45.125 Test: blockdev write read invalid size ...passed 00:09:45.125 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:45.125 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:45.125 Test: blockdev write read max offset ...passed 00:09:45.125 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:45.125 Test: blockdev writev readv 8 blocks ...passed 00:09:45.125 Test: blockdev writev readv 30 x 1block ...passed 00:09:45.125 Test: blockdev writev readv block ...passed 00:09:45.125 Test: blockdev writev readv size > 128k ...passed 00:09:45.125 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:45.125 Test: blockdev comparev and writev ...passed 00:09:45.125 Test: blockdev nvme passthru rw ...passed 00:09:45.125 Test: blockdev nvme passthru vendor specific ...passed 00:09:45.125 Test: blockdev nvme admin passthru ...passed 00:09:45.125 Test: blockdev copy ...passed 00:09:45.125 00:09:45.125 Run Summary: Type Total Ran Passed Failed Inactive 00:09:45.125 suites 16 16 n/a 0 0 00:09:45.125 tests 368 368 368 0 0 00:09:45.125 asserts 2224 2224 2224 0 n/a 00:09:45.125 00:09:45.125 Elapsed time = 0.496 seconds 00:09:45.125 0 00:09:45.125 22:16:55 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 3402844 00:09:45.125 22:16:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 3402844 ']' 00:09:45.125 22:16:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 3402844 
00:09:45.125 22:16:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:09:45.125 22:16:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:45.125 22:16:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3402844 00:09:45.125 22:16:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:45.125 22:16:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:45.125 22:16:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3402844' 00:09:45.125 killing process with pid 3402844 00:09:45.125 22:16:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 3402844 00:09:45.125 22:16:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 3402844 00:09:45.384 22:16:55 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:09:45.384 00:09:45.384 real 0m1.711s 00:09:45.384 user 0m3.975s 00:09:45.384 sys 0m0.561s 00:09:45.384 22:16:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:45.384 22:16:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:45.384 ************************************ 00:09:45.384 END TEST bdev_bounds 00:09:45.384 ************************************ 00:09:45.384 22:16:55 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:45.385 22:16:55 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:09:45.385 22:16:55 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:45.385 22:16:55 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:45.385 22:16:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:45.385 ************************************ 00:09:45.385 START TEST bdev_nbd 00:09:45.385 ************************************ 00:09:45.385 22:16:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=3403057 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 3403057 /var/tmp/spdk-nbd.sock 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 3403057 ']' 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:45.654 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:45.654 22:16:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:45.654 [2024-07-12 22:16:55.783122] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
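Before any NBD mapping happens, the harness starts bdev_svc against the bdev JSON config on a dedicated RPC socket and waits for it to come up. A minimal sketch of that startup step, reusing the socket path and config shown in the trace but with an illustrative polling loop rather than the harness's actual waitforlisten helper:

rpc_sock=/var/tmp/spdk-nbd.sock
spdk/test/app/bdev_svc/bdev_svc -r "$rpc_sock" -i 0 --json spdk/test/bdev/bdev.json &
nbd_pid=$!
# keep polling the RPC socket until the app answers, then nbd_start_disk calls can begin
until spdk/scripts/rpc.py -s "$rpc_sock" rpc_get_methods >/dev/null 2>&1; do
    sleep 0.1
done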
00:09:45.654 [2024-07-12 22:16:55.783198] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:45.654 [2024-07-12 22:16:55.916149] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:45.912 [2024-07-12 22:16:56.018334] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:45.912 [2024-07-12 22:16:56.178506] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:45.912 [2024-07-12 22:16:56.178579] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:45.913 [2024-07-12 22:16:56.178599] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:45.913 [2024-07-12 22:16:56.186513] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:45.913 [2024-07-12 22:16:56.186545] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:45.913 [2024-07-12 22:16:56.194523] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:45.913 [2024-07-12 22:16:56.194553] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:46.173 [2024-07-12 22:16:56.272082] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:46.173 [2024-07-12 22:16:56.272141] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:46.173 [2024-07-12 22:16:56.272165] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2147a40 00:09:46.173 [2024-07-12 22:16:56.272185] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:46.173 [2024-07-12 22:16:56.273679] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:46.173 [2024-07-12 22:16:56.273714] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 
'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:46.432 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:46.722 1+0 records in 00:09:46.722 1+0 records out 00:09:46.722 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247665 s, 16.5 MB/s 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:46.722 22:16:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:09:46.980 22:16:57 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:46.980 1+0 records in 00:09:46.980 1+0 records out 00:09:46.980 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283235 s, 14.5 MB/s 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:46.980 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:09:47.238 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:09:47.238 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:09:47.238 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:09:47.238 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:09:47.238 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:47.238 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:47.238 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:47.238 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:09:47.238 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:47.238 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:47.238 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:47.238 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:47.238 1+0 records in 00:09:47.238 1+0 records out 00:09:47.238 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000341667 s, 12.0 MB/s 00:09:47.238 
22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:47.496 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:47.496 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:47.496 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:47.496 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:47.496 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:47.496 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:47.496 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:47.754 1+0 records in 00:09:47.754 1+0 records out 00:09:47.754 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000376507 s, 10.9 MB/s 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:47.754 22:16:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:48.013 1+0 records in 00:09:48.013 1+0 records out 00:09:48.013 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290499 s, 14.1 MB/s 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:48.013 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:09:48.272 1+0 records in 00:09:48.272 1+0 records out 00:09:48.272 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000435402 s, 9.4 MB/s 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:48.272 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:48.531 1+0 records in 00:09:48.531 1+0 records out 00:09:48.531 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00034659 s, 11.8 MB/s 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:48.531 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 
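Each nbd_start_disk above is followed by the same waitfornbd check: poll /proc/partitions until the kernel exposes the new device, then read a single 4 KiB block through O_DIRECT and confirm a full block arrived. A condensed sketch of that repeated pattern (the retry count mirrors the trace, the scratch-file path is shortened, and the body is illustrative rather than the exact autotest_common.sh code):

waitfornbd() {
    local nbd_name=$1 i
    # wait until the kernel has registered the device as a partition entry
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1
    done
    # one direct-I/O read of a single 4096-byte block proves the device answers
    dd if=/dev/"$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct
    [ "$(stat -c %s /tmp/nbdtest)" != 0 ] || return 1
    rm -f /tmp/nbdtest
}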
00:09:48.789 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:09:48.789 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:09:48.789 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:09:48.789 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:09:48.789 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:48.789 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:48.789 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:48.790 1+0 records in 00:09:48.790 1+0 records out 00:09:48.790 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000547408 s, 7.5 MB/s 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:48.790 22:16:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:49.065 22:16:59 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:49.065 1+0 records in 00:09:49.065 1+0 records out 00:09:49.065 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000440977 s, 9.3 MB/s 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:49.065 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:09:49.323 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:49.323 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:49.323 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:49.323 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:49.323 1+0 records in 00:09:49.323 1+0 records out 00:09:49.323 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000561178 s, 7.3 MB/s 00:09:49.323 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.323 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:49.323 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.323 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:49.323 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:49.323 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:49.323 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:49.323 22:16:59 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:49.582 1+0 records in 00:09:49.582 1+0 records out 00:09:49.582 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000595614 s, 6.9 MB/s 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:49.582 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:49.582 1+0 records in 00:09:49.582 1+0 records out 00:09:49.582 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000524899 s, 7.8 MB/s 00:09:49.840 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.840 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:49.840 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.840 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:49.840 22:16:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:49.841 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:49.841 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:49.841 22:16:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:49.841 1+0 records in 00:09:49.841 1+0 records out 00:09:49.841 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000584883 s, 7.0 MB/s 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:49.841 22:17:00 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:49.841 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:50.099 1+0 records in 00:09:50.099 1+0 records out 00:09:50.099 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000597273 s, 6.9 MB/s 00:09:50.099 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.356 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:50.356 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.356 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:50.356 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:50.356 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:50.356 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:50.356 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w 
nbd14 /proc/partitions 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:50.614 1+0 records in 00:09:50.614 1+0 records out 00:09:50.614 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000547769 s, 7.5 MB/s 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:50.614 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:50.873 1+0 records in 00:09:50.873 1+0 records out 00:09:50.873 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000664019 s, 6.2 MB/s 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:50.873 22:17:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:51.131 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd0", 00:09:51.131 "bdev_name": "Malloc0" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd1", 00:09:51.131 "bdev_name": "Malloc1p0" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd2", 00:09:51.131 "bdev_name": "Malloc1p1" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd3", 00:09:51.131 "bdev_name": "Malloc2p0" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd4", 00:09:51.131 "bdev_name": "Malloc2p1" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd5", 00:09:51.131 "bdev_name": "Malloc2p2" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd6", 00:09:51.131 "bdev_name": "Malloc2p3" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd7", 00:09:51.131 "bdev_name": "Malloc2p4" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd8", 00:09:51.131 "bdev_name": "Malloc2p5" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd9", 00:09:51.131 "bdev_name": "Malloc2p6" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd10", 00:09:51.131 "bdev_name": "Malloc2p7" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd11", 00:09:51.131 "bdev_name": "TestPT" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd12", 00:09:51.131 "bdev_name": "raid0" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd13", 00:09:51.131 "bdev_name": "concat0" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd14", 00:09:51.131 "bdev_name": "raid1" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd15", 00:09:51.131 "bdev_name": "AIO0" 00:09:51.131 } 00:09:51.131 ]' 00:09:51.131 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:09:51.131 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd0", 00:09:51.131 "bdev_name": "Malloc0" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd1", 00:09:51.131 "bdev_name": "Malloc1p0" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd2", 00:09:51.131 "bdev_name": "Malloc1p1" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd3", 00:09:51.131 "bdev_name": "Malloc2p0" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd4", 00:09:51.131 "bdev_name": "Malloc2p1" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd5", 00:09:51.131 "bdev_name": "Malloc2p2" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd6", 00:09:51.131 "bdev_name": "Malloc2p3" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd7", 00:09:51.131 "bdev_name": "Malloc2p4" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd8", 00:09:51.131 "bdev_name": "Malloc2p5" 
00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd9", 00:09:51.131 "bdev_name": "Malloc2p6" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd10", 00:09:51.131 "bdev_name": "Malloc2p7" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd11", 00:09:51.131 "bdev_name": "TestPT" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd12", 00:09:51.131 "bdev_name": "raid0" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd13", 00:09:51.131 "bdev_name": "concat0" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd14", 00:09:51.131 "bdev_name": "raid1" 00:09:51.131 }, 00:09:51.131 { 00:09:51.131 "nbd_device": "/dev/nbd15", 00:09:51.131 "bdev_name": "AIO0" 00:09:51.131 } 00:09:51.131 ]' 00:09:51.131 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:09:51.131 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:09:51.131 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:51.131 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:09:51.131 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:51.131 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:51.131 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:51.131 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:51.389 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:51.389 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:51.389 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:51.389 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:51.389 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:51.389 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:51.389 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:51.389 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:51.389 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:51.389 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:51.648 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:51.648 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:51.648 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:51.648 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:51.648 22:17:01 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:51.648 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:51.648 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:51.648 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:51.648 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:51.648 22:17:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:51.906 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:51.906 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:51.906 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:51.906 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:51.906 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:51.906 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:09:51.906 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:51.906 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:51.906 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:51.906 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:52.163 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:52.163 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:52.163 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:52.163 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:52.164 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:52.164 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:52.164 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:52.164 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:52.164 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:52.164 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:52.422 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:52.422 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:52.422 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:52.422 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:52.422 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:52.422 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:09:52.422 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:52.422 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:52.422 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 
-- # for i in "${nbd_list[@]}" 00:09:52.422 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:52.680 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:52.680 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:52.680 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:52.680 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:52.680 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:52.680 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:52.680 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:52.680 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:52.680 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:52.681 22:17:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:52.939 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:52.939 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:52.939 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:52.939 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:52.939 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:52.939 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:52.939 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:52.939 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:52.939 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:52.939 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:53.198 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:53.198 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:53.198 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:53.198 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:53.198 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:53.198 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:53.198 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:53.198 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:53.198 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:53.198 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:53.457 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:53.457 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd8 00:09:53.457 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:53.457 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:53.457 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:53.457 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:53.457 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:53.457 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:53.457 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:53.457 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:53.716 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:53.716 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:53.716 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:53.716 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:53.716 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:53.716 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:53.716 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:53.716 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:53.716 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:53.716 22:17:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:53.975 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:53.975 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:53.975 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:53.975 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:53.975 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:53.975 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:53.975 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:53.975 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:53.975 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:53.975 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:54.233 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:54.233 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:54.233 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:54.233 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:54.233 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:54.233 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd11 /proc/partitions 00:09:54.233 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:54.233 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:54.233 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:54.233 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:54.491 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:09:54.492 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:54.492 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:54.492 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:54.492 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:54.492 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:54.492 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:54.492 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:54.492 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:54.492 22:17:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:54.750 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:54.750 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:54.750 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:54.750 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:55.009 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:55.009 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:55.009 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:55.009 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:55.009 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:55.009 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:09:55.267 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:55.267 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:55.267 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:55.267 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:55.267 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:55.267 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:55.267 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:55.267 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:55.267 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:55.267 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:55.527 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:55.527 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:55.527 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:55.527 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:55.527 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:55.527 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:55.527 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:55.527 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:55.527 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:55.527 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:55.527 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' 
'/dev/nbd8' '/dev/nbd9') 00:09:55.786 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:09:55.787 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:55.787 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:55.787 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:55.787 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:55.787 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:55.787 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:55.787 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:09:55.787 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:55.787 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:55.787 22:17:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:09:56.046 /dev/nbd0 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:56.046 1+0 records in 00:09:56.046 1+0 records out 00:09:56.046 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259617 s, 15.8 MB/s 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:56.046 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:09:56.305 /dev/nbd1 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:56.305 1+0 records in 00:09:56.305 1+0 records out 00:09:56.305 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232628 s, 17.6 MB/s 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:56.305 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:09:56.564 /dev/nbd10 00:09:56.564 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:09:56.564 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:09:56.564 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:09:56.564 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:56.564 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 
)) 00:09:56.564 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:56.564 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:09:56.564 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:56.564 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:56.564 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:56.564 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:56.564 1+0 records in 00:09:56.564 1+0 records out 00:09:56.564 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000295455 s, 13.9 MB/s 00:09:56.565 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:56.565 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:56.565 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:56.565 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:56.565 22:17:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:56.565 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:56.565 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:56.565 22:17:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:09:56.823 /dev/nbd11 00:09:56.823 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:09:56.823 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:09:56.823 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:09:56.823 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:56.823 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:56.823 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:56.823 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:09:56.823 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:56.823 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:56.823 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:56.823 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:56.823 1+0 records in 00:09:56.823 1+0 records out 00:09:56.823 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000354863 s, 11.5 MB/s 00:09:56.824 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:56.824 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:56.824 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm 
-f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:56.824 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:56.824 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:56.824 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:56.824 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:56.824 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:09:57.082 /dev/nbd12 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:57.082 1+0 records in 00:09:57.082 1+0 records out 00:09:57.082 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277933 s, 14.7 MB/s 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:57.082 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:09:57.341 /dev/nbd13 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:57.341 22:17:07 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:57.341 1+0 records in 00:09:57.341 1+0 records out 00:09:57.341 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000404685 s, 10.1 MB/s 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:57.341 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:09:57.600 /dev/nbd14 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:57.600 1+0 records in 00:09:57.600 1+0 records out 00:09:57.600 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000459807 s, 8.9 MB/s 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:57.600 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:57.601 22:17:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:09:57.859 /dev/nbd15 00:09:57.859 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:09:57.859 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:09:57.859 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:09:57.859 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:57.859 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:57.859 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:57.859 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:09:57.859 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:57.859 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:57.859 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:57.859 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:58.118 1+0 records in 00:09:58.118 1+0 records out 00:09:58.118 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000466601 s, 8.8 MB/s 00:09:58.118 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:58.118 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:58.118 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:58.118 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:58.118 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:58.118 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:58.118 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:58.118 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:09:58.118 /dev/nbd2 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:58.377 22:17:08 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:58.377 1+0 records in 00:09:58.377 1+0 records out 00:09:58.377 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000539583 s, 7.6 MB/s 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:58.377 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:09:58.636 /dev/nbd3 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:58.636 1+0 records in 00:09:58.636 1+0 records out 00:09:58.636 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000621559 s, 6.6 MB/s 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:58.636 22:17:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:09:58.895 /dev/nbd4 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:58.895 1+0 records in 00:09:58.895 1+0 records out 00:09:58.895 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000594479 s, 6.9 MB/s 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:58.895 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:09:59.155 /dev/nbd5 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:59.155 22:17:09 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:59.155 1+0 records in 00:09:59.155 1+0 records out 00:09:59.155 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000634202 s, 6.5 MB/s 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:59.155 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:09:59.415 /dev/nbd6 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:59.415 1+0 records in 00:09:59.415 1+0 records out 00:09:59.415 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000685514 s, 6.0 MB/s 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:59.415 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:09:59.674 /dev/nbd7 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:59.674 1+0 records in 00:09:59.674 1+0 records out 00:09:59.674 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000750514 s, 5.5 MB/s 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:59.674 22:17:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:09:59.933 /dev/nbd8 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:59.933 22:17:10 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:59.933 1+0 records in 00:09:59.933 1+0 records out 00:09:59.933 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000602128 s, 6.8 MB/s 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:59.933 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:10:00.192 /dev/nbd9 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:00.192 1+0 records in 00:10:00.192 1+0 records out 00:10:00.192 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00072007 s, 5.7 MB/s 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
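The trace repeated for each device above is the per-device readiness check: after each nbd_start_disk RPC the test calls waitfornbd, which polls /proc/partitions for the new entry (up to 20 attempts) and then reads a single 4 KiB block with direct I/O to confirm the device actually serves requests. A minimal sketch of that helper, reconstructed from the trace rather than from the script source (the retry interval and the scratch-file path here are assumptions; the trace reads into .../spdk/test/bdev/nbdtest):

  waitfornbd() {
      local nbd_name=$1 i size
      # Wait for the kernel to list the device in /proc/partitions.
      for ((i = 1; i <= 20; i++)); do
          grep -q -w "$nbd_name" /proc/partitions && break
          sleep 0.1   # interval assumed; not visible in the trace
      done
      # Prove the device answers I/O: read one 4096-byte block with O_DIRECT.
      for ((i = 1; i <= 20; i++)); do
          dd if=/dev/$nbd_name of=/tmp/nbdtest bs=4096 count=1 iflag=direct && break
          sleep 0.1
      done
      size=$(stat -c %s /tmp/nbdtest)
      rm -f /tmp/nbdtest
      [ "$size" != 0 ]   # the trace checks '[' 4096 '!=' 0 ']' before returning 0
  }

The matching teardown helper, waitfornbd_exit, traced earlier for nbd5 through nbd15, loops on the same grep after nbd_stop_disk until the entry drops out of /proc/partitions before moving on to the next device.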
00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:00.192 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:00.451 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd0", 00:10:00.451 "bdev_name": "Malloc0" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd1", 00:10:00.451 "bdev_name": "Malloc1p0" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd10", 00:10:00.451 "bdev_name": "Malloc1p1" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd11", 00:10:00.451 "bdev_name": "Malloc2p0" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd12", 00:10:00.451 "bdev_name": "Malloc2p1" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd13", 00:10:00.451 "bdev_name": "Malloc2p2" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd14", 00:10:00.451 "bdev_name": "Malloc2p3" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd15", 00:10:00.451 "bdev_name": "Malloc2p4" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd2", 00:10:00.451 "bdev_name": "Malloc2p5" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd3", 00:10:00.451 "bdev_name": "Malloc2p6" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd4", 00:10:00.451 "bdev_name": "Malloc2p7" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd5", 00:10:00.451 "bdev_name": "TestPT" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd6", 00:10:00.451 "bdev_name": "raid0" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd7", 00:10:00.451 "bdev_name": "concat0" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd8", 00:10:00.451 "bdev_name": "raid1" 00:10:00.451 }, 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd9", 00:10:00.451 "bdev_name": "AIO0" 00:10:00.451 } 00:10:00.451 ]' 00:10:00.451 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:00.451 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:00.451 { 00:10:00.451 "nbd_device": "/dev/nbd0", 00:10:00.451 "bdev_name": "Malloc0" 00:10:00.451 }, 00:10:00.451 { 00:10:00.452 "nbd_device": "/dev/nbd1", 00:10:00.452 "bdev_name": "Malloc1p0" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd10", 00:10:00.452 "bdev_name": "Malloc1p1" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd11", 00:10:00.452 "bdev_name": "Malloc2p0" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd12", 00:10:00.452 "bdev_name": "Malloc2p1" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd13", 00:10:00.452 "bdev_name": "Malloc2p2" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd14", 00:10:00.452 
"bdev_name": "Malloc2p3" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd15", 00:10:00.452 "bdev_name": "Malloc2p4" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd2", 00:10:00.452 "bdev_name": "Malloc2p5" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd3", 00:10:00.452 "bdev_name": "Malloc2p6" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd4", 00:10:00.452 "bdev_name": "Malloc2p7" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd5", 00:10:00.452 "bdev_name": "TestPT" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd6", 00:10:00.452 "bdev_name": "raid0" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd7", 00:10:00.452 "bdev_name": "concat0" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd8", 00:10:00.452 "bdev_name": "raid1" 00:10:00.452 }, 00:10:00.452 { 00:10:00.452 "nbd_device": "/dev/nbd9", 00:10:00.452 "bdev_name": "AIO0" 00:10:00.452 } 00:10:00.452 ]' 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:10:00.452 /dev/nbd1 00:10:00.452 /dev/nbd10 00:10:00.452 /dev/nbd11 00:10:00.452 /dev/nbd12 00:10:00.452 /dev/nbd13 00:10:00.452 /dev/nbd14 00:10:00.452 /dev/nbd15 00:10:00.452 /dev/nbd2 00:10:00.452 /dev/nbd3 00:10:00.452 /dev/nbd4 00:10:00.452 /dev/nbd5 00:10:00.452 /dev/nbd6 00:10:00.452 /dev/nbd7 00:10:00.452 /dev/nbd8 00:10:00.452 /dev/nbd9' 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:10:00.452 /dev/nbd1 00:10:00.452 /dev/nbd10 00:10:00.452 /dev/nbd11 00:10:00.452 /dev/nbd12 00:10:00.452 /dev/nbd13 00:10:00.452 /dev/nbd14 00:10:00.452 /dev/nbd15 00:10:00.452 /dev/nbd2 00:10:00.452 /dev/nbd3 00:10:00.452 /dev/nbd4 00:10:00.452 /dev/nbd5 00:10:00.452 /dev/nbd6 00:10:00.452 /dev/nbd7 00:10:00.452 /dev/nbd8 00:10:00.452 /dev/nbd9' 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:10:00.452 256+0 records in 00:10:00.452 256+0 records out 00:10:00.452 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106943 s, 98.1 MB/s 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:00.452 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:10:00.711 256+0 records in 00:10:00.711 256+0 records out 00:10:00.711 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.170985 s, 6.1 MB/s 00:10:00.711 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:00.712 22:17:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:10:00.970 256+0 records in 00:10:00.970 256+0 records out 00:10:00.970 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.176864 s, 5.9 MB/s 00:10:00.970 22:17:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:00.970 22:17:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:10:00.970 256+0 records in 00:10:00.970 256+0 records out 00:10:00.970 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.144807 s, 7.2 MB/s 00:10:00.970 22:17:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:00.970 22:17:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:10:01.230 256+0 records in 00:10:01.230 256+0 records out 00:10:01.230 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.141713 s, 7.4 MB/s 00:10:01.230 22:17:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:01.230 22:17:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:10:01.490 256+0 records in 00:10:01.490 256+0 records out 00:10:01.490 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.146234 s, 7.2 MB/s 00:10:01.490 22:17:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:01.490 22:17:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:10:01.490 256+0 records in 00:10:01.490 256+0 records out 00:10:01.490 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182397 s, 5.7 MB/s 00:10:01.490 22:17:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:01.490 22:17:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:10:01.749 256+0 records in 00:10:01.749 256+0 records out 00:10:01.749 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183321 s, 5.7 MB/s 00:10:01.749 22:17:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:01.750 22:17:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 
of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:10:02.011 256+0 records in 00:10:02.011 256+0 records out 00:10:02.011 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183891 s, 5.7 MB/s 00:10:02.011 22:17:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:02.011 22:17:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:10:02.011 256+0 records in 00:10:02.011 256+0 records out 00:10:02.011 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.181796 s, 5.8 MB/s 00:10:02.011 22:17:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:02.011 22:17:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:10:02.308 256+0 records in 00:10:02.309 256+0 records out 00:10:02.309 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18246 s, 5.7 MB/s 00:10:02.309 22:17:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:02.309 22:17:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:10:02.567 256+0 records in 00:10:02.567 256+0 records out 00:10:02.567 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.136057 s, 7.7 MB/s 00:10:02.567 22:17:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:02.568 22:17:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:10:02.568 256+0 records in 00:10:02.568 256+0 records out 00:10:02.568 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.11698 s, 9.0 MB/s 00:10:02.568 22:17:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:02.568 22:17:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:10:02.827 256+0 records in 00:10:02.827 256+0 records out 00:10:02.827 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.179267 s, 5.8 MB/s 00:10:02.827 22:17:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:02.827 22:17:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:10:02.827 256+0 records in 00:10:02.827 256+0 records out 00:10:02.827 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.186446 s, 5.6 MB/s 00:10:02.827 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:02.827 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:10:03.086 256+0 records in 00:10:03.086 256+0 records out 00:10:03.086 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.186754 s, 5.6 MB/s 00:10:03.086 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:03.086 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:10:03.346 
256+0 records in 00:10:03.347 256+0 records out 00:10:03.347 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182538 s, 5.7 MB/s 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 
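The dd/cmp sequence traced here is the data-integrity pass of the nbd test: a 1 MiB file of random data is written once with dd, copied onto each /dev/nbdX with oflag=direct, and then cmp -b -n 1M checks every device back against that same file. A minimal bash sketch of that pattern, using a hypothetical helper name (this is a reconstruction from the xtrace, not the nbd_common.sh source):

nbd_rand_write_verify() {
    local tmp_file=$1; shift
    local nbd_list=("$@")
    # 1 MiB (256 x 4 KiB blocks) of random data as the reference pattern
    dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
    for dev in "${nbd_list[@]}"; do
        # write the pattern to each exported NBD device, bypassing the page cache
        dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
    done
    for dev in "${nbd_list[@]}"; do
        # byte-wise compare of the first 1 MiB; any mismatch fails the test
        cmp -b -n 1M "$tmp_file" "$dev" || return 1
    done
}
# usage: nbd_rand_write_verify /tmp/nbdrandtest /dev/nbd0 /dev/nbd1 ... /dev/nbd9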
00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:03.347 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:03.607 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:03.607 22:17:13 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:03.607 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:03.607 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:03.607 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:03.607 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:03.607 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:10:03.866 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:10:03.866 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:03.866 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:03.866 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:03.866 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:03.866 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:03.866 22:17:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:10:04.126 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:10:04.126 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:10:04.126 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:10:04.126 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:04.126 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:04.126 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:10:04.126 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:04.126 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:04.126 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:04.126 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:10:04.385 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:10:04.385 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:10:04.385 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:10:04.385 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:04.385 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:04.385 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:10:04.385 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:04.385 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:04.385 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:04.385 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:10:04.644 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:10:04.644 22:17:14 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:10:04.644 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:10:04.644 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:04.644 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:04.644 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:10:04.644 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:04.644 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:04.644 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:04.644 22:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:10:04.904 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:10:04.904 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:10:04.904 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:10:04.904 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:04.904 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:04.904 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:10:04.904 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:04.904 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:04.904 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:04.904 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:10:05.163 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:10:05.163 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:10:05.163 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:10:05.163 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:05.163 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:05.163 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:10:05.163 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:05.163 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:05.163 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:05.163 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:10:05.423 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:10:05.423 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:10:05.423 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:10:05.423 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:05.423 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:05.423 22:17:15 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:10:05.423 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:05.423 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:05.423 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:05.423 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:10:05.683 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:10:05.683 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:10:05.683 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:10:05.683 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:05.683 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:05.683 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:10:05.683 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:05.683 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:05.683 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:05.683 22:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:10:05.942 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:10:05.942 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:10:05.942 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:10:05.942 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:05.942 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:05.942 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:10:05.942 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:05.942 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:05.942 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:05.942 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:10:06.202 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:10:06.202 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:10:06.202 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:10:06.202 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:06.202 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:06.202 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:10:06.202 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:06.202 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:06.202 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:06.202 22:17:16 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:10:06.461 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:10:06.461 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:10:06.461 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:10:06.461 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:06.462 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:06.462 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:10:06.462 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:06.462 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:06.462 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:06.462 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:06.721 22:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:10:06.980 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:10:06.980 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:10:06.980 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:10:06.980 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:06.980 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:06.980 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:10:06.980 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:06.980 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:06.980 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:06.980 22:17:17 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:10:07.239 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:10:07.239 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:10:07.239 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:10:07.239 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:07.239 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:07.239 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:10:07.239 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:07.239 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:07.239 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:07.239 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:10:07.498 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:10:07.498 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:10:07.498 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:10:07.498 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:07.498 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:07.498 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:10:07.498 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:07.498 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:07.498 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:07.498 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:10:07.757 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:10:07.757 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:10:07.757 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:10:07.757 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:07.757 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:07.757 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:10:07.757 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:07.757 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:07.757 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:07.757 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:07.757 22:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 
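In the stop phase traced above, each nbd_stop_disk RPC is followed by a wait loop that polls /proc/partitions up to 20 times, 0.1 s apart, until the nbd name disappears, and the final nbd_get_disks call is then expected to return an empty list. A rough bash reconstruction of that wait loop from the xtrace (not the nbd_common.sh source; the exact implementation may differ):

waitfornbd_exit() {
    local nbd_name=$1
    local i
    for (( i = 1; i <= 20; i++ )); do
        # once the device no longer shows up in /proc/partitions it has been torn down
        grep -q -w "$nbd_name" /proc/partitions || break
        sleep 0.1
    done
    return 0
}
# usage: rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 && waitfornbd_exit nbd0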
00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:10:08.016 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:10:08.275 malloc_lvol_verify 00:10:08.275 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:10:08.534 1af0cec7-26b8-40ae-80c0-8cb64cb242d8 00:10:08.534 22:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:10:08.792 7b77a315-332a-4011-be17-57be676680a8 00:10:08.792 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:10:09.051 /dev/nbd0 00:10:09.051 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:10:09.051 mke2fs 1.46.5 (30-Dec-2021) 00:10:09.051 Discarding device blocks: 0/4096 done 00:10:09.051 Creating filesystem with 4096 1k blocks and 1024 inodes 00:10:09.051 00:10:09.051 Allocating group tables: 0/1 done 00:10:09.051 Writing inode tables: 0/1 done 00:10:09.051 Creating journal (1024 blocks): done 00:10:09.051 Writing superblocks and filesystem accounting information: 0/1 done 00:10:09.051 00:10:09.051 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:10:09.051 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks 
/var/tmp/spdk-nbd.sock /dev/nbd0 00:10:09.051 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:09.051 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:09.051 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:09.051 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:10:09.051 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:09.051 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 3403057 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 3403057 ']' 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 3403057 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3403057 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3403057' 00:10:09.310 killing process with pid 3403057 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 3403057 00:10:09.310 22:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 3403057 00:10:09.879 22:17:19 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:10:09.879 00:10:09.879 real 0m24.227s 00:10:09.879 user 0m29.518s 00:10:09.879 sys 0m14.371s 00:10:09.879 22:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:09.879 22:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:10:09.879 ************************************ 00:10:09.879 END TEST bdev_nbd 00:10:09.879 ************************************ 00:10:09.879 22:17:19 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:09.879 22:17:19 blockdev_general -- bdev/blockdev.sh@763 
-- # [[ y == y ]] 00:10:09.879 22:17:19 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:10:09.879 22:17:19 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:10:09.879 22:17:19 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:10:09.879 22:17:19 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:09.879 22:17:19 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:09.879 22:17:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:09.879 ************************************ 00:10:09.879 START TEST bdev_fio 00:10:09.879 ************************************ 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:09.879 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:10:09.879 22:17:20 
blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in 
"${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:10:09.879 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:10:09.880 22:17:20 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:09.880 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:09.880 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:09.880 22:17:20 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:09.880 ************************************ 00:10:09.880 START TEST bdev_fio_rw_verify 00:10:09.880 ************************************ 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:09.880 22:17:20 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:10:09.880 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:10.147 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:10.147 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:10.147 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:10:10.147 22:17:20 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:10.405 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:10:10.405 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:10.405 fio-3.35 00:10:10.405 Starting 16 threads 00:10:22.599 00:10:22.599 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=3407071: Fri Jul 12 22:17:31 2024 00:10:22.599 read: IOPS=86.7k, BW=338MiB/s (355MB/s)(3385MiB/10001msec) 00:10:22.599 slat (nsec): min=1895, max=242555, avg=37070.25, stdev=14843.76 00:10:22.599 clat (usec): min=12, max=1192, avg=306.14, stdev=134.06 00:10:22.599 lat (usec): min=24, max=1244, avg=343.21, stdev=141.63 00:10:22.599 clat percentiles (usec): 00:10:22.599 | 50.000th=[ 302], 99.000th=[ 586], 99.900th=[ 652], 99.990th=[ 898], 00:10:22.599 | 99.999th=[ 979] 00:10:22.599 write: IOPS=135k, BW=529MiB/s (555MB/s)(5221MiB/9870msec); 0 zone resets 00:10:22.599 slat (usec): min=8, max=4690, avg=50.64, stdev=15.84 00:10:22.599 clat (usec): min=11, max=2190, avg=361.38, stdev=159.25 00:10:22.599 lat (usec): min=37, max=5713, avg=412.02, stdev=166.72 00:10:22.599 clat percentiles (usec): 00:10:22.599 | 50.000th=[ 347], 99.000th=[ 775], 99.900th=[ 955], 99.990th=[ 1029], 00:10:22.599 | 99.999th=[ 1811] 00:10:22.599 bw ( KiB/s): min=463568, max=708356, per=98.91%, avg=535766.53, stdev=3528.68, samples=304 00:10:22.599 iops : min=115892, max=177088, avg=133941.58, stdev=882.16, samples=304 00:10:22.599 lat (usec) : 20=0.01%, 50=0.33%, 100=3.51%, 250=28.30%, 500=51.84% 00:10:22.599 lat (usec) : 750=15.31%, 1000=0.68% 00:10:22.599 lat (msec) : 2=0.01%, 4=0.01% 00:10:22.599 cpu : usr=99.22%, sys=0.38%, ctx=601, majf=0, minf=2748 00:10:22.599 IO depths : 1=12.4%, 2=24.7%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:10:22.599 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:22.599 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:22.599 issued rwts: total=866628,1336637,0,0 short=0,0,0,0 dropped=0,0,0,0 00:10:22.599 latency : target=0, window=0, percentile=100.00%, depth=8 00:10:22.599 00:10:22.599 Run status group 0 (all jobs): 00:10:22.599 READ: bw=338MiB/s 
(355MB/s), 338MiB/s-338MiB/s (355MB/s-355MB/s), io=3385MiB (3550MB), run=10001-10001msec 00:10:22.599 WRITE: bw=529MiB/s (555MB/s), 529MiB/s-529MiB/s (555MB/s-555MB/s), io=5221MiB (5475MB), run=9870-9870msec 00:10:22.599 00:10:22.599 real 0m11.469s 00:10:22.599 user 2m44.608s 00:10:22.599 sys 0m1.167s 00:10:22.599 22:17:31 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:22.599 22:17:31 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:10:22.599 ************************************ 00:10:22.599 END TEST bdev_fio_rw_verify 00:10:22.599 ************************************ 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:10:22.599 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:10:22.600 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "ae196e9f-543d-41a6-bffa-6a68ff3a6532"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ae196e9f-543d-41a6-bffa-6a68ff3a6532",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": 
true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "e36572d6-d4e9-5019-99fd-98556fa1c2b7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e36572d6-d4e9-5019-99fd-98556fa1c2b7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "5b9ee6f0-fd32-5a78-adb2-6a1fba2116b6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "5b9ee6f0-fd32-5a78-adb2-6a1fba2116b6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "e4d42630-ec49-5c20-8b43-d259fe7da222"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e4d42630-ec49-5c20-8b43-d259fe7da222",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "202e34ce-32e9-5aee-8b80-bbdcbcf72816"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 
8192,' ' "uuid": "202e34ce-32e9-5aee-8b80-bbdcbcf72816",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "e22f1047-8289-57e0-880d-fb66f2ebbf2f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e22f1047-8289-57e0-880d-fb66f2ebbf2f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "2f6cb6b0-6109-5343-a3cf-1abc380217f1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "2f6cb6b0-6109-5343-a3cf-1abc380217f1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "cb275119-56e3-5141-b0a1-7c6fc29be7d1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "cb275119-56e3-5141-b0a1-7c6fc29be7d1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": 
false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "1c5ff529-2d37-5e37-8979-4472172373d6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1c5ff529-2d37-5e37-8979-4472172373d6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "e04a7d64-17ea-5cb3-969f-4768e1852c9d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e04a7d64-17ea-5cb3-969f-4768e1852c9d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "82207083-d649-5185-bc0a-8944c83f132a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "82207083-d649-5185-bc0a-8944c83f132a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "b131dffa-6b7f-5914-9b5e-81fb928f2687"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b131dffa-6b7f-5914-9b5e-81fb928f2687",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "49b1f3aa-a753-4e29-ba43-8b9c6d056607"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "49b1f3aa-a753-4e29-ba43-8b9c6d056607",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "49b1f3aa-a753-4e29-ba43-8b9c6d056607",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "1b56e5d5-60c9-4d89-9181-84e05ad68d59",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "5ec7940d-9f51-4001-b493-f7ab5dffcfed",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "fa972a55-2c46-4da1-b5aa-afd16cfc527d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "fa972a55-2c46-4da1-b5aa-afd16cfc527d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "fa972a55-2c46-4da1-b5aa-afd16cfc527d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "c1fcf652-f04c-4416-9ab6-e0cf63c3ba6c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "9d438139-ec64-495a-8784-b5f5738b985c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "8ab30bb2-c4eb-4b4b-b6c1-ab96eaa674a3"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8ab30bb2-c4eb-4b4b-b6c1-ab96eaa674a3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "8ab30bb2-c4eb-4b4b-b6c1-ab96eaa674a3",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "9fa6c3e3-e845-4e31-b9b0-7c6614842f68",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "85260974-9020-4e32-8fa2-e17299f382a1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "71ed0ede-a550-4bb7-a37c-e69dce76ae18"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "71ed0ede-a550-4bb7-a37c-e69dce76ae18",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": 
"/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:10:22.601 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:10:22.601 Malloc1p0 00:10:22.601 Malloc1p1 00:10:22.601 Malloc2p0 00:10:22.601 Malloc2p1 00:10:22.601 Malloc2p2 00:10:22.601 Malloc2p3 00:10:22.601 Malloc2p4 00:10:22.601 Malloc2p5 00:10:22.601 Malloc2p6 00:10:22.601 Malloc2p7 00:10:22.601 TestPT 00:10:22.601 raid0 00:10:22.601 concat0 ]] 00:10:22.601 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "ae196e9f-543d-41a6-bffa-6a68ff3a6532"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ae196e9f-543d-41a6-bffa-6a68ff3a6532",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "e36572d6-d4e9-5019-99fd-98556fa1c2b7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e36572d6-d4e9-5019-99fd-98556fa1c2b7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "5b9ee6f0-fd32-5a78-adb2-6a1fba2116b6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "5b9ee6f0-fd32-5a78-adb2-6a1fba2116b6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "e4d42630-ec49-5c20-8b43-d259fe7da222"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e4d42630-ec49-5c20-8b43-d259fe7da222",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "202e34ce-32e9-5aee-8b80-bbdcbcf72816"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "202e34ce-32e9-5aee-8b80-bbdcbcf72816",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "e22f1047-8289-57e0-880d-fb66f2ebbf2f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e22f1047-8289-57e0-880d-fb66f2ebbf2f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "2f6cb6b0-6109-5343-a3cf-1abc380217f1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "2f6cb6b0-6109-5343-a3cf-1abc380217f1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "cb275119-56e3-5141-b0a1-7c6fc29be7d1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "cb275119-56e3-5141-b0a1-7c6fc29be7d1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "1c5ff529-2d37-5e37-8979-4472172373d6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1c5ff529-2d37-5e37-8979-4472172373d6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "e04a7d64-17ea-5cb3-969f-4768e1852c9d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e04a7d64-17ea-5cb3-969f-4768e1852c9d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' 
' "82207083-d649-5185-bc0a-8944c83f132a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "82207083-d649-5185-bc0a-8944c83f132a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "b131dffa-6b7f-5914-9b5e-81fb928f2687"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b131dffa-6b7f-5914-9b5e-81fb928f2687",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "49b1f3aa-a753-4e29-ba43-8b9c6d056607"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "49b1f3aa-a753-4e29-ba43-8b9c6d056607",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "49b1f3aa-a753-4e29-ba43-8b9c6d056607",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' 
"base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "1b56e5d5-60c9-4d89-9181-84e05ad68d59",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "5ec7940d-9f51-4001-b493-f7ab5dffcfed",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "fa972a55-2c46-4da1-b5aa-afd16cfc527d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "fa972a55-2c46-4da1-b5aa-afd16cfc527d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "fa972a55-2c46-4da1-b5aa-afd16cfc527d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "c1fcf652-f04c-4416-9ab6-e0cf63c3ba6c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "9d438139-ec64-495a-8784-b5f5738b985c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "8ab30bb2-c4eb-4b4b-b6c1-ab96eaa674a3"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8ab30bb2-c4eb-4b4b-b6c1-ab96eaa674a3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "8ab30bb2-c4eb-4b4b-b6c1-ab96eaa674a3",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' 
"num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "9fa6c3e3-e845-4e31-b9b0-7c6614842f68",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "85260974-9020-4e32-8fa2-e17299f382a1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "71ed0ede-a550-4bb7-a37c-e69dce76ae18"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "71ed0ede-a550-4bb7-a37c-e69dce76ae18",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 
'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:22.602 22:17:31 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:22.602 ************************************ 00:10:22.602 START TEST bdev_fio_trim 00:10:22.602 ************************************ 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:10:22.602 
22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:10:22.602 22:17:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:22.603 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:22.603 fio-3.35 00:10:22.603 Starting 14 threads 00:10:34.793 00:10:34.793 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=3408771: Fri Jul 12 22:17:42 2024 00:10:34.793 write: IOPS=139k, BW=543MiB/s (570MB/s)(5435MiB/10002msec); 0 zone resets 00:10:34.793 slat (nsec): min=1972, max=899610, avg=35640.66, stdev=10591.82 00:10:34.793 clat (usec): min=15, max=3795, avg=251.25, stdev=91.40 00:10:34.793 lat (usec): min=24, max=3813, avg=286.89, stdev=95.79 00:10:34.793 clat percentiles (usec): 00:10:34.793 | 50.000th=[ 241], 99.000th=[ 474], 99.900th=[ 537], 99.990th=[ 685], 00:10:34.793 | 99.999th=[ 1188] 00:10:34.793 bw ( KiB/s): min=462336, max=740075, per=100.00%, avg=560625.00, stdev=6338.07, samples=266 00:10:34.793 iops : min=115584, 
max=185016, avg=140156.11, stdev=1584.51, samples=266 00:10:34.793 trim: IOPS=139k, BW=543MiB/s (570MB/s)(5435MiB/10002msec); 0 zone resets 00:10:34.793 slat (usec): min=3, max=188, avg=24.25, stdev= 6.68 00:10:34.793 clat (usec): min=3, max=3813, avg=284.51, stdev=97.76 00:10:34.793 lat (usec): min=8, max=3831, avg=308.76, stdev=101.01 00:10:34.793 clat percentiles (usec): 00:10:34.793 | 50.000th=[ 277], 99.000th=[ 519], 99.900th=[ 578], 99.990th=[ 627], 00:10:34.793 | 99.999th=[ 709] 00:10:34.793 bw ( KiB/s): min=462336, max=740075, per=100.00%, avg=560625.00, stdev=6338.10, samples=266 00:10:34.793 iops : min=115584, max=185016, avg=140156.11, stdev=1584.51, samples=266 00:10:34.793 lat (usec) : 4=0.01%, 10=0.02%, 20=0.07%, 50=0.20%, 100=1.35% 00:10:34.793 lat (usec) : 250=45.12%, 500=52.24%, 750=1.01%, 1000=0.01% 00:10:34.793 lat (msec) : 2=0.01%, 4=0.01% 00:10:34.793 cpu : usr=99.59%, sys=0.00%, ctx=545, majf=0, minf=942 00:10:34.793 IO depths : 1=12.4%, 2=24.9%, 4=50.0%, 8=12.7%, 16=0.0%, 32=0.0%, >=64=0.0% 00:10:34.793 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:34.793 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:34.793 issued rwts: total=0,1391250,1391253,0 short=0,0,0,0 dropped=0,0,0,0 00:10:34.793 latency : target=0, window=0, percentile=100.00%, depth=8 00:10:34.793 00:10:34.793 Run status group 0 (all jobs): 00:10:34.793 WRITE: bw=543MiB/s (570MB/s), 543MiB/s-543MiB/s (570MB/s-570MB/s), io=5435MiB (5699MB), run=10002-10002msec 00:10:34.793 TRIM: bw=543MiB/s (570MB/s), 543MiB/s-543MiB/s (570MB/s-570MB/s), io=5435MiB (5699MB), run=10002-10002msec 00:10:34.793 00:10:34.793 real 0m11.708s 00:10:34.793 user 2m25.850s 00:10:34.793 sys 0m0.849s 00:10:34.793 22:17:43 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:34.793 22:17:43 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:10:34.793 ************************************ 00:10:34.793 END TEST bdev_fio_trim 00:10:34.793 ************************************ 00:10:34.793 22:17:43 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:10:34.793 22:17:43 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:10:34.793 22:17:43 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:34.793 22:17:43 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:10:34.793 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:10:34.793 22:17:43 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:10:34.793 00:10:34.793 real 0m23.532s 00:10:34.793 user 5m10.653s 00:10:34.793 sys 0m2.207s 00:10:34.794 22:17:43 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:34.794 22:17:43 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:34.794 ************************************ 00:10:34.794 END TEST bdev_fio 00:10:34.794 ************************************ 00:10:34.794 22:17:43 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:34.794 22:17:43 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:10:34.794 22:17:43 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 
-C -m 0x3 '' 00:10:34.794 22:17:43 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:10:34.794 22:17:43 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:34.794 22:17:43 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:34.794 ************************************ 00:10:34.794 START TEST bdev_verify 00:10:34.794 ************************************ 00:10:34.794 22:17:43 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:10:34.794 [2024-07-12 22:17:43.676973] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:10:34.794 [2024-07-12 22:17:43.677034] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3410217 ] 00:10:34.794 [2024-07-12 22:17:43.805749] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:34.794 [2024-07-12 22:17:43.908474] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:34.794 [2024-07-12 22:17:43.908480] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:34.794 [2024-07-12 22:17:44.061038] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:34.794 [2024-07-12 22:17:44.061100] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:34.794 [2024-07-12 22:17:44.061115] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:34.794 [2024-07-12 22:17:44.069047] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:34.794 [2024-07-12 22:17:44.069074] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:34.794 [2024-07-12 22:17:44.077060] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:34.794 [2024-07-12 22:17:44.077085] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:34.794 [2024-07-12 22:17:44.149328] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:34.794 [2024-07-12 22:17:44.149379] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:34.794 [2024-07-12 22:17:44.149399] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x277a4d0 00:10:34.794 [2024-07-12 22:17:44.149412] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:34.794 [2024-07-12 22:17:44.151033] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:34.794 [2024-07-12 22:17:44.151062] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:34.794 Running I/O for 5 seconds... 
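A minimal sketch, assuming the paths and options are exactly those traced in the run_test bdev_verify command above, of how this 5-second verify pass could be re-run by hand against the generated bdev.json (-q is the per-job queue depth, -o the I/O size in bytes, -w the workload, -t the run time in seconds, -m the reactor core mask; -C is passed through unchanged from the logged command):

    # bdev.json describes the Malloc, split, passthru, RAID and AIO bdevs dumped earlier in this stage
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$SPDK/build/examples/bdevperf" \
        --json "$SPDK/test/bdev/bdev.json" \
        -q 128 -o 4096 -w verify -t 5 -C -m 0x3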
00:10:40.116 00:10:40.116 Latency(us) 00:10:40.116 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:40.116 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x1000 00:10:40.116 Malloc0 : 5.18 1137.69 4.44 0.00 0.00 112285.03 594.81 246187.41 00:10:40.116 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x1000 length 0x1000 00:10:40.116 Malloc0 : 5.20 1133.10 4.43 0.00 0.00 112739.46 598.37 386605.41 00:10:40.116 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x800 00:10:40.116 Malloc1p0 : 5.22 588.17 2.30 0.00 0.00 216425.59 3490.50 230686.72 00:10:40.116 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x800 length 0x800 00:10:40.116 Malloc1p0 : 5.20 590.92 2.31 0.00 0.00 215439.36 3490.50 219745.06 00:10:40.116 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x800 00:10:40.116 Malloc1p1 : 5.23 587.93 2.30 0.00 0.00 215919.74 3647.22 227039.50 00:10:40.116 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x800 length 0x800 00:10:40.116 Malloc1p1 : 5.20 590.66 2.31 0.00 0.00 214935.15 3647.22 215186.03 00:10:40.116 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x200 00:10:40.116 Malloc2p0 : 5.23 587.69 2.30 0.00 0.00 215398.08 3476.26 222480.47 00:10:40.116 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x200 length 0x200 00:10:40.116 Malloc2p0 : 5.20 590.39 2.31 0.00 0.00 214424.41 3462.01 208803.39 00:10:40.116 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x200 00:10:40.116 Malloc2p1 : 5.23 587.45 2.29 0.00 0.00 214860.52 3647.22 217921.45 00:10:40.116 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x200 length 0x200 00:10:40.116 Malloc2p1 : 5.21 590.13 2.31 0.00 0.00 213882.44 3632.97 204244.37 00:10:40.116 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x200 00:10:40.116 Malloc2p2 : 5.23 587.21 2.29 0.00 0.00 214289.73 3519.00 211538.81 00:10:40.116 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x200 length 0x200 00:10:40.116 Malloc2p2 : 5.21 589.89 2.30 0.00 0.00 213301.86 3561.74 198773.54 00:10:40.116 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x200 00:10:40.116 Malloc2p3 : 5.23 586.97 2.29 0.00 0.00 213766.69 3476.26 206979.78 00:10:40.116 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x200 length 0x200 00:10:40.116 Malloc2p3 : 5.21 589.65 2.30 0.00 0.00 212783.08 3476.26 194214.51 00:10:40.116 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x200 00:10:40.116 Malloc2p4 : 5.24 586.73 2.29 0.00 0.00 213283.98 
3732.70 202420.76 00:10:40.116 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x200 length 0x200 00:10:40.116 Malloc2p4 : 5.21 589.39 2.30 0.00 0.00 212307.30 3590.23 189655.49 00:10:40.116 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x200 00:10:40.116 Malloc2p5 : 5.24 586.49 2.29 0.00 0.00 212713.39 3447.76 197861.73 00:10:40.116 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x200 length 0x200 00:10:40.116 Malloc2p5 : 5.21 589.13 2.30 0.00 0.00 211769.36 3462.01 185096.46 00:10:40.116 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x200 00:10:40.116 Malloc2p6 : 5.24 586.25 2.29 0.00 0.00 212175.24 3504.75 194214.51 00:10:40.116 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x200 length 0x200 00:10:40.116 Malloc2p6 : 5.22 588.88 2.30 0.00 0.00 211249.67 3504.75 179625.63 00:10:40.116 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x200 00:10:40.116 Malloc2p7 : 5.24 586.02 2.29 0.00 0.00 211637.64 3291.05 190567.29 00:10:40.116 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x200 length 0x200 00:10:40.116 Malloc2p7 : 5.22 588.61 2.30 0.00 0.00 210715.96 3319.54 175978.41 00:10:40.116 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x1000 00:10:40.116 TestPT : 5.26 583.68 2.28 0.00 0.00 211729.88 15956.59 188743.68 00:10:40.116 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x1000 length 0x1000 00:10:40.116 TestPT : 5.24 567.25 2.22 0.00 0.00 217249.77 14189.97 258952.68 00:10:40.116 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x2000 00:10:40.116 raid0 : 5.25 585.52 2.29 0.00 0.00 210385.99 3405.02 170507.58 00:10:40.116 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x2000 length 0x2000 00:10:40.116 raid0 : 5.26 608.71 2.38 0.00 0.00 202431.72 3447.76 154095.08 00:10:40.116 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x2000 00:10:40.116 concat0 : 5.25 585.26 2.29 0.00 0.00 209882.28 3490.50 166860.35 00:10:40.116 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x2000 length 0x2000 00:10:40.116 concat0 : 5.26 608.47 2.38 0.00 0.00 201945.00 3462.01 155006.89 00:10:40.116 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x0 length 0x1000 00:10:40.116 raid1 : 5.27 607.14 2.37 0.00 0.00 201804.90 2350.75 163213.13 00:10:40.116 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x1000 length 0x1000 00:10:40.116 raid1 : 5.26 608.23 2.38 0.00 0.00 201472.23 4103.12 161389.52 00:10:40.116 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: 
start 0x0 length 0x4e2 00:10:40.116 AIO0 : 5.27 606.97 2.37 0.00 0.00 201237.84 1560.04 165948.55 00:10:40.116 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:40.116 Verification LBA range: start 0x4e2 length 0x4e2 00:10:40.116 AIO0 : 5.26 608.01 2.38 0.00 0.00 200923.11 1581.41 168683.97 00:10:40.116 =================================================================================================================== 00:10:40.116 Total : 20008.61 78.16 0.00 0.00 199848.36 594.81 386605.41 00:10:40.116 00:10:40.116 real 0m6.489s 00:10:40.116 user 0m12.044s 00:10:40.116 sys 0m0.382s 00:10:40.116 22:17:50 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:40.116 22:17:50 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:10:40.116 ************************************ 00:10:40.116 END TEST bdev_verify 00:10:40.116 ************************************ 00:10:40.116 22:17:50 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:40.116 22:17:50 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:10:40.116 22:17:50 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:10:40.116 22:17:50 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:40.116 22:17:50 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:40.116 ************************************ 00:10:40.116 START TEST bdev_verify_big_io 00:10:40.116 ************************************ 00:10:40.116 22:17:50 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:10:40.116 [2024-07-12 22:17:50.259646] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
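Note: the bdev_verify_big_io run starting above drives bdevperf directly against the pre-generated bdev.json. A minimal sketch of an equivalent manual invocation, assuming the SPDK checkout lives at the workspace path shown in the log and that test/bdev/bdev.json was already produced by the surrounding blockdev.sh harness:

# -q 128   : queue depth per job (the warnings below show it being clamped per bdev)
# -o 65536 : I/O size in bytes (64 KiB)
# -w verify: data-verification workload
# -t 5     : run time in seconds
# -m 0x3   : core mask, cores 0 and 1; -C passed through as used by the harness above
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK_DIR/build/examples/bdevperf" --json "$SPDK_DIR/test/bdev/bdev.json" \
    -q 128 -o 65536 -w verify -t 5 -C -m 0x3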
00:10:40.116 [2024-07-12 22:17:50.259710] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3411111 ] 00:10:40.117 [2024-07-12 22:17:50.388359] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:40.373 [2024-07-12 22:17:50.490747] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:40.373 [2024-07-12 22:17:50.490753] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:40.373 [2024-07-12 22:17:50.647797] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:40.373 [2024-07-12 22:17:50.647849] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:40.373 [2024-07-12 22:17:50.647864] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:40.373 [2024-07-12 22:17:50.655803] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:40.373 [2024-07-12 22:17:50.655839] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:40.373 [2024-07-12 22:17:50.663819] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:40.373 [2024-07-12 22:17:50.663844] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:40.630 [2024-07-12 22:17:50.740876] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:40.630 [2024-07-12 22:17:50.740934] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:40.630 [2024-07-12 22:17:50.740953] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd8e4d0 00:10:40.630 [2024-07-12 22:17:50.740966] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:40.630 [2024-07-12 22:17:50.742616] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:40.630 [2024-07-12 22:17:50.742647] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:40.630 [2024-07-12 22:17:50.912105] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.913414] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.915326] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.916625] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.918534] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.919801] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.921497] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.922999] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.923960] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.925450] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.926406] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.927892] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.928870] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:10:40.630 [2024-07-12 22:17:50.930363] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:10:40.631 [2024-07-12 22:17:50.931278] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:10:40.631 [2024-07-12 22:17:50.932636] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32 00:10:40.631 [2024-07-12 22:17:50.955534] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:10:40.887 [2024-07-12 22:17:50.957460] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:10:40.887 Running I/O for 5 seconds... 00:10:49.005 00:10:49.005 Latency(us) 00:10:49.005 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:49.005 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x0 length 0x100 00:10:49.005 Malloc0 : 5.97 150.20 9.39 0.00 0.00 835119.05 901.12 2377988.01 00:10:49.005 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x100 length 0x100 00:10:49.005 Malloc0 : 6.14 145.87 9.12 0.00 0.00 860597.15 872.63 2407165.77 00:10:49.005 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x0 length 0x80 00:10:49.005 Malloc1p0 : 6.51 49.74 3.11 0.00 0.00 2347133.03 2478.97 3997354.07 00:10:49.005 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x80 length 0x80 00:10:49.005 Malloc1p0 : 6.36 72.94 4.56 0.00 0.00 1635240.48 2977.61 2874010.05 00:10:49.005 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x0 length 0x80 00:10:49.005 Malloc1p1 : 6.91 34.75 2.17 0.00 0.00 3163946.79 1631.28 5339531.35 00:10:49.005 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x80 length 0x80 00:10:49.005 Malloc1p1 : 6.79 35.33 2.21 0.00 0.00 3170396.23 1595.66 5514597.95 00:10:49.005 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x0 length 0x20 00:10:49.005 Malloc2p0 : 6.24 23.08 1.44 0.00 0.00 1209149.25 641.11 2013265.92 00:10:49.005 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x20 length 0x20 00:10:49.005 Malloc2p0 : 6.28 22.92 1.43 0.00 0.00 1232594.01 655.36 2057032.57 00:10:49.005 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x0 length 0x20 00:10:49.005 Malloc2p1 : 6.31 25.35 1.58 0.00 0.00 1112341.98 644.67 1984088.15 00:10:49.005 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x20 length 0x20 00:10:49.005 Malloc2p1 : 6.28 22.92 1.43 0.00 0.00 1221162.23 644.67 2027854.80 00:10:49.005 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x0 length 0x20 00:10:49.005 Malloc2p2 : 6.31 25.34 1.58 0.00 0.00 1101934.25 648.24 1954910.39 00:10:49.005 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x20 length 0x20 00:10:49.005 Malloc2p2 : 6.28 22.91 1.43 0.00 0.00 1210645.68 644.67 2013265.92 00:10:49.005 Job: Malloc2p3 (Core Mask 0x1, workload: 
verify, depth: 32, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x0 length 0x20 00:10:49.005 Malloc2p3 : 6.31 25.34 1.58 0.00 0.00 1091717.73 644.67 1925732.62 00:10:49.005 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x20 length 0x20 00:10:49.005 Malloc2p3 : 6.29 22.91 1.43 0.00 0.00 1199350.98 651.80 1984088.15 00:10:49.005 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:49.005 Verification LBA range: start 0x0 length 0x20 00:10:49.005 Malloc2p4 : 6.32 25.33 1.58 0.00 0.00 1082004.54 648.24 1896554.85 00:10:49.006 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x20 length 0x20 00:10:49.006 Malloc2p4 : 6.29 22.90 1.43 0.00 0.00 1187685.03 655.36 1954910.39 00:10:49.006 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x0 length 0x20 00:10:49.006 Malloc2p5 : 6.32 25.33 1.58 0.00 0.00 1072285.86 687.42 1867377.09 00:10:49.006 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x20 length 0x20 00:10:49.006 Malloc2p5 : 6.29 22.90 1.43 0.00 0.00 1177220.17 683.85 1925732.62 00:10:49.006 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x0 length 0x20 00:10:49.006 Malloc2p6 : 6.32 25.32 1.58 0.00 0.00 1062257.65 737.28 1838199.32 00:10:49.006 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x20 length 0x20 00:10:49.006 Malloc2p6 : 6.29 22.89 1.43 0.00 0.00 1166254.07 726.59 1911143.74 00:10:49.006 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x0 length 0x20 00:10:49.006 Malloc2p7 : 6.32 25.32 1.58 0.00 0.00 1051812.94 765.77 1816315.99 00:10:49.006 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x20 length 0x20 00:10:49.006 Malloc2p7 : 6.36 25.14 1.57 0.00 0.00 1060439.76 765.77 1881965.97 00:10:49.006 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x0 length 0x100 00:10:49.006 TestPT : 6.94 34.72 2.17 0.00 0.00 2886981.51 99842.67 3968176.31 00:10:49.006 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x100 length 0x100 00:10:49.006 TestPT : 6.93 34.76 2.17 0.00 0.00 2881469.07 90724.62 3997354.07 00:10:49.006 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x0 length 0x200 00:10:49.006 raid0 : 6.95 41.43 2.59 0.00 0.00 2394967.83 2706.92 4697620.48 00:10:49.006 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x200 length 0x200 00:10:49.006 raid0 : 6.68 40.70 2.54 0.00 0.00 2419828.41 2749.66 4755976.01 00:10:49.006 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x0 length 0x200 00:10:49.006 concat0 : 6.74 51.65 3.23 0.00 0.00 1882140.08 2592.95 4522553.88 00:10:49.006 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x200 length 0x200 00:10:49.006 concat0 : 6.80 47.09 2.94 0.00 
0.00 2051661.00 2493.22 4580909.41 00:10:49.006 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x0 length 0x100 00:10:49.006 raid1 : 6.95 65.94 4.12 0.00 0.00 1434621.18 2877.89 4318309.51 00:10:49.006 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x100 length 0x100 00:10:49.006 raid1 : 6.94 72.19 4.51 0.00 0.00 1314538.24 2592.95 4376665.04 00:10:49.006 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x0 length 0x4e 00:10:49.006 AIO0 : 6.95 54.68 3.42 0.00 0.00 1022498.65 808.51 2757298.98 00:10:49.006 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:10:49.006 Verification LBA range: start 0x4e length 0x4e 00:10:49.006 AIO0 : 6.94 48.11 3.01 0.00 0.00 1162777.50 826.32 2903187.81 00:10:49.006 =================================================================================================================== 00:10:49.006 Total : 1366.00 85.38 0.00 0.00 1517249.91 641.11 5514597.95 00:10:49.006 00:10:49.006 real 0m8.235s 00:10:49.006 user 0m15.468s 00:10:49.006 sys 0m0.439s 00:10:49.006 22:17:58 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:49.006 22:17:58 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:10:49.006 ************************************ 00:10:49.006 END TEST bdev_verify_big_io 00:10:49.006 ************************************ 00:10:49.006 22:17:58 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:49.006 22:17:58 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:49.006 22:17:58 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:49.006 22:17:58 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:49.006 22:17:58 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:49.006 ************************************ 00:10:49.006 START TEST bdev_write_zeroes 00:10:49.006 ************************************ 00:10:49.006 22:17:58 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:49.006 [2024-07-12 22:17:58.561688] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
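The MiB/s figures in the result tables above are simply IOPS scaled by the I/O size. A quick arithmetic check against the two Total rows reported above (4096-byte I/Os for bdev_verify, 65536-byte I/Os for bdev_verify_big_io):

# MiB/s = IOPS * io_size_bytes / 1048576
awk 'BEGIN { printf "verify : %.2f MiB/s\n", 20008.61 * 4096  / 1048576 }'   # 20008.61 * 4 KiB = 78.16 MiB/s, matching the verify Total
awk 'BEGIN { printf "big_io : %.2f MiB/s\n", 1366.00  * 65536 / 1048576 }'   # 1366.00 * 64 KiB = 85.375 MiB/s, reported above as 85.38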
00:10:49.006 [2024-07-12 22:17:58.561749] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3412180 ] 00:10:49.006 [2024-07-12 22:17:58.688912] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:49.006 [2024-07-12 22:17:58.789602] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:49.006 [2024-07-12 22:17:58.951150] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:49.006 [2024-07-12 22:17:58.951214] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:49.006 [2024-07-12 22:17:58.951229] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:49.006 [2024-07-12 22:17:58.959160] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:49.006 [2024-07-12 22:17:58.959188] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:49.006 [2024-07-12 22:17:58.967166] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:49.006 [2024-07-12 22:17:58.967192] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:49.006 [2024-07-12 22:17:59.044148] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:49.006 [2024-07-12 22:17:59.044201] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:49.006 [2024-07-12 22:17:59.044221] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x139ac10 00:10:49.006 [2024-07-12 22:17:59.044234] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:49.006 [2024-07-12 22:17:59.045726] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:49.006 [2024-07-12 22:17:59.045756] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:49.006 Running I/O for 1 seconds... 
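The vbdev_passthru notices above show the TestPT bdev being stacked on top of Malloc3 before each bdevperf run. A minimal sketch of building the same stacking by hand over the RPC interface (an assumption for illustration: a running SPDK target that already exposes Malloc3; scripts/rpc.py ships with the SPDK tree):

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
# create a passthru vbdev named TestPT on top of the existing Malloc3 base bdev
"$SPDK_DIR/scripts/rpc.py" bdev_passthru_create -b Malloc3 -p TestPT
# tear it down again when finished
"$SPDK_DIR/scripts/rpc.py" bdev_passthru_delete TestPT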
00:10:50.382 00:10:50.382 Latency(us) 00:10:50.382 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:50.382 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.382 Malloc0 : 1.03 4970.80 19.42 0.00 0.00 25740.69 673.17 43310.75 00:10:50.382 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.382 Malloc1p0 : 1.03 4963.61 19.39 0.00 0.00 25729.65 918.93 42398.94 00:10:50.382 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.382 Malloc1p1 : 1.03 4956.44 19.36 0.00 0.00 25702.64 911.81 41487.14 00:10:50.382 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 Malloc2p0 : 1.03 4949.27 19.33 0.00 0.00 25680.56 908.24 40575.33 00:10:50.383 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 Malloc2p1 : 1.04 4942.14 19.31 0.00 0.00 25659.63 911.81 39663.53 00:10:50.383 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 Malloc2p2 : 1.06 4965.15 19.40 0.00 0.00 25489.60 911.81 38979.67 00:10:50.383 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 Malloc2p3 : 1.06 4958.12 19.37 0.00 0.00 25470.07 911.81 38067.87 00:10:50.383 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 Malloc2p4 : 1.06 4951.15 19.34 0.00 0.00 25452.72 908.24 37156.06 00:10:50.383 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 Malloc2p5 : 1.06 4944.25 19.31 0.00 0.00 25430.83 908.24 36244.26 00:10:50.383 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 Malloc2p6 : 1.06 4937.27 19.29 0.00 0.00 25409.71 897.56 35332.45 00:10:50.383 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 Malloc2p7 : 1.06 4930.41 19.26 0.00 0.00 25390.78 933.18 34420.65 00:10:50.383 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 TestPT : 1.07 4923.53 19.23 0.00 0.00 25369.94 954.55 33508.84 00:10:50.383 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 raid0 : 1.07 4915.59 19.20 0.00 0.00 25339.73 1624.15 31913.18 00:10:50.383 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 concat0 : 1.07 4907.81 19.17 0.00 0.00 25281.39 1609.91 30317.52 00:10:50.383 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 raid1 : 1.07 4898.06 19.13 0.00 0.00 25217.57 2578.70 27582.11 00:10:50.383 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:50.383 AIO0 : 1.07 4892.14 19.11 0.00 0.00 25135.44 1061.40 27240.18 00:10:50.383 =================================================================================================================== 00:10:50.383 Total : 79005.74 308.62 0.00 0.00 25467.01 673.17 43310.75 00:10:50.641 00:10:50.641 real 0m2.249s 00:10:50.641 user 0m1.832s 00:10:50.641 sys 0m0.339s 00:10:50.641 22:18:00 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:50.641 22:18:00 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:10:50.641 ************************************ 00:10:50.641 END TEST bdev_write_zeroes 00:10:50.641 ************************************ 00:10:50.641 22:18:00 blockdev_general 
-- common/autotest_common.sh@1142 -- # return 0 00:10:50.641 22:18:00 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:50.641 22:18:00 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:50.641 22:18:00 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:50.641 22:18:00 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:50.641 ************************************ 00:10:50.641 START TEST bdev_json_nonenclosed 00:10:50.641 ************************************ 00:10:50.641 22:18:00 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:50.641 [2024-07-12 22:18:00.892750] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:10:50.641 [2024-07-12 22:18:00.892808] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3412603 ] 00:10:50.900 [2024-07-12 22:18:01.011046] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:50.900 [2024-07-12 22:18:01.108784] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:50.900 [2024-07-12 22:18:01.108853] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:10:50.900 [2024-07-12 22:18:01.108874] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:50.900 [2024-07-12 22:18:01.108886] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:50.900 00:10:50.900 real 0m0.372s 00:10:50.900 user 0m0.226s 00:10:50.900 sys 0m0.144s 00:10:50.900 22:18:01 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:10:50.900 22:18:01 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:50.900 22:18:01 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:10:50.900 ************************************ 00:10:50.900 END TEST bdev_json_nonenclosed 00:10:50.900 ************************************ 00:10:51.159 22:18:01 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:10:51.159 22:18:01 blockdev_general -- bdev/blockdev.sh@782 -- # true 00:10:51.159 22:18:01 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:51.159 22:18:01 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:51.159 22:18:01 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:51.159 22:18:01 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:51.159 ************************************ 00:10:51.159 START TEST bdev_json_nonarray 00:10:51.159 ************************************ 00:10:51.159 22:18:01 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:51.159 [2024-07-12 22:18:01.337350] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:10:51.159 [2024-07-12 22:18:01.337412] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3412652 ] 00:10:51.159 [2024-07-12 22:18:01.455526] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:51.419 [2024-07-12 22:18:01.555933] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:51.419 [2024-07-12 22:18:01.556014] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
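The two negative tests above feed bdevperf deliberately malformed configs: nonenclosed.json is not wrapped in {} and nonarray.json carries a 'subsystems' value that is not an array, so json_config_prepare_ctx rejects both. A minimal sketch of the shape a valid --json config must take (the malloc bdev entry is illustrative, not copied from the test files):

cat > /tmp/valid_bdev_config.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        { "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 262144, "block_size": 512 } }
      ]
    }
  ]
}
EOF
# a config shaped like this is what the bdevperf --json option expects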
00:10:51.419 [2024-07-12 22:18:01.556037] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:51.419 [2024-07-12 22:18:01.556050] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:51.419 00:10:51.419 real 0m0.382s 00:10:51.419 user 0m0.234s 00:10:51.419 sys 0m0.144s 00:10:51.419 22:18:01 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:10:51.419 22:18:01 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:51.419 22:18:01 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:10:51.419 ************************************ 00:10:51.419 END TEST bdev_json_nonarray 00:10:51.419 ************************************ 00:10:51.419 22:18:01 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:10:51.419 22:18:01 blockdev_general -- bdev/blockdev.sh@785 -- # true 00:10:51.419 22:18:01 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:10:51.419 22:18:01 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:10:51.419 22:18:01 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:51.419 22:18:01 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:51.419 22:18:01 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:51.419 ************************************ 00:10:51.419 START TEST bdev_qos 00:10:51.419 ************************************ 00:10:51.419 22:18:01 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:10:51.419 22:18:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:10:51.419 22:18:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=3412709 00:10:51.419 22:18:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 3412709' 00:10:51.419 Process qos testing pid: 3412709 00:10:51.419 22:18:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:10:51.419 22:18:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 3412709 00:10:51.419 22:18:01 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 3412709 ']' 00:10:51.419 22:18:01 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:51.419 22:18:01 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:51.419 22:18:01 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:51.419 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:51.419 22:18:01 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:51.419 22:18:01 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:51.678 [2024-07-12 22:18:01.782857] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:10:51.678 [2024-07-12 22:18:01.782919] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3412709 ] 00:10:51.678 [2024-07-12 22:18:01.902837] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:51.936 [2024-07-12 22:18:02.013294] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:51.936 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:51.936 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:10:51.936 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:10:51.936 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:51.936 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:52.196 Malloc_0 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:52.196 [ 00:10:52.196 { 00:10:52.196 "name": "Malloc_0", 00:10:52.196 "aliases": [ 00:10:52.196 "7d2586f4-0114-40e4-994f-c9346a182a10" 00:10:52.196 ], 00:10:52.196 "product_name": "Malloc disk", 00:10:52.196 "block_size": 512, 00:10:52.196 "num_blocks": 262144, 00:10:52.196 "uuid": "7d2586f4-0114-40e4-994f-c9346a182a10", 00:10:52.196 "assigned_rate_limits": { 00:10:52.196 "rw_ios_per_sec": 0, 00:10:52.196 "rw_mbytes_per_sec": 0, 00:10:52.196 "r_mbytes_per_sec": 0, 00:10:52.196 "w_mbytes_per_sec": 0 00:10:52.196 }, 00:10:52.196 "claimed": false, 00:10:52.196 "zoned": false, 00:10:52.196 "supported_io_types": { 00:10:52.196 "read": true, 00:10:52.196 "write": true, 00:10:52.196 "unmap": true, 00:10:52.196 "flush": true, 00:10:52.196 "reset": true, 00:10:52.196 "nvme_admin": false, 00:10:52.196 "nvme_io": false, 00:10:52.196 "nvme_io_md": false, 00:10:52.196 "write_zeroes": true, 00:10:52.196 "zcopy": true, 00:10:52.196 "get_zone_info": false, 00:10:52.196 "zone_management": false, 00:10:52.196 "zone_append": false, 00:10:52.196 "compare": false, 
00:10:52.196 "compare_and_write": false, 00:10:52.196 "abort": true, 00:10:52.196 "seek_hole": false, 00:10:52.196 "seek_data": false, 00:10:52.196 "copy": true, 00:10:52.196 "nvme_iov_md": false 00:10:52.196 }, 00:10:52.196 "memory_domains": [ 00:10:52.196 { 00:10:52.196 "dma_device_id": "system", 00:10:52.196 "dma_device_type": 1 00:10:52.196 }, 00:10:52.196 { 00:10:52.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:52.196 "dma_device_type": 2 00:10:52.196 } 00:10:52.196 ], 00:10:52.196 "driver_specific": {} 00:10:52.196 } 00:10:52.196 ] 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:52.196 Null_1 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:52.196 [ 00:10:52.196 { 00:10:52.196 "name": "Null_1", 00:10:52.196 "aliases": [ 00:10:52.196 "07c6c35e-20ee-4db2-812c-3ec76b4cdfad" 00:10:52.196 ], 00:10:52.196 "product_name": "Null disk", 00:10:52.196 "block_size": 512, 00:10:52.196 "num_blocks": 262144, 00:10:52.196 "uuid": "07c6c35e-20ee-4db2-812c-3ec76b4cdfad", 00:10:52.196 "assigned_rate_limits": { 00:10:52.196 "rw_ios_per_sec": 0, 00:10:52.196 "rw_mbytes_per_sec": 0, 00:10:52.196 "r_mbytes_per_sec": 0, 00:10:52.196 "w_mbytes_per_sec": 0 00:10:52.196 }, 00:10:52.196 "claimed": false, 00:10:52.196 "zoned": false, 00:10:52.196 "supported_io_types": { 00:10:52.196 "read": true, 00:10:52.196 "write": true, 00:10:52.196 "unmap": false, 00:10:52.196 "flush": false, 00:10:52.196 "reset": true, 00:10:52.196 "nvme_admin": false, 00:10:52.196 "nvme_io": false, 00:10:52.196 "nvme_io_md": false, 00:10:52.196 "write_zeroes": true, 00:10:52.196 "zcopy": false, 00:10:52.196 "get_zone_info": false, 00:10:52.196 "zone_management": false, 00:10:52.196 "zone_append": false, 00:10:52.196 
"compare": false, 00:10:52.196 "compare_and_write": false, 00:10:52.196 "abort": true, 00:10:52.196 "seek_hole": false, 00:10:52.196 "seek_data": false, 00:10:52.196 "copy": false, 00:10:52.196 "nvme_iov_md": false 00:10:52.196 }, 00:10:52.196 "driver_specific": {} 00:10:52.196 } 00:10:52.196 ] 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:52.196 22:18:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:10:52.196 Running I/O for 60 seconds... 
00:10:57.469 22:18:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 62129.90 248519.59 0.00 0.00 249856.00 0.00 0.00 ' 00:10:57.469 22:18:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:10:57.469 22:18:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:10:57.469 22:18:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=62129.90 00:10:57.469 22:18:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 62129 00:10:57.469 22:18:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=62129 00:10:57.469 22:18:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=15000 00:10:57.469 22:18:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 15000 -gt 1000 ']' 00:10:57.470 22:18:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 15000 Malloc_0 00:10:57.470 22:18:07 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:57.470 22:18:07 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:57.470 22:18:07 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:57.470 22:18:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 15000 IOPS Malloc_0 00:10:57.470 22:18:07 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:57.470 22:18:07 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:57.470 22:18:07 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:57.470 ************************************ 00:10:57.470 START TEST bdev_qos_iops 00:10:57.470 ************************************ 00:10:57.470 22:18:07 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 15000 IOPS Malloc_0 00:10:57.470 22:18:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=15000 00:10:57.470 22:18:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:57.470 22:18:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:10:57.470 22:18:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:10:57.470 22:18:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:57.470 22:18:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:57.470 22:18:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:57.470 22:18:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:57.470 22:18:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:11:02.749 22:18:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 14997.61 59990.46 0.00 0.00 61140.00 0.00 0.00 ' 00:11:02.749 22:18:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:11:02.749 22:18:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:11:02.749 22:18:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=14997.61 00:11:02.749 22:18:12 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 14997 00:11:02.749 22:18:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=14997 00:11:02.749 22:18:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:11:02.749 22:18:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=13500 00:11:02.749 22:18:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=16500 00:11:02.749 22:18:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14997 -lt 13500 ']' 00:11:02.749 22:18:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14997 -gt 16500 ']' 00:11:02.749 00:11:02.749 real 0m5.245s 00:11:02.749 user 0m0.113s 00:11:02.749 sys 0m0.043s 00:11:02.749 22:18:12 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:02.749 22:18:12 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:11:02.749 ************************************ 00:11:02.749 END TEST bdev_qos_iops 00:11:02.749 ************************************ 00:11:02.749 22:18:12 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:11:02.749 22:18:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:11:02.749 22:18:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:11:02.749 22:18:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:11:02.749 22:18:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:02.749 22:18:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:02.749 22:18:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:11:02.749 22:18:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:11:08.017 22:18:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 20062.79 80251.18 0.00 0.00 81920.00 0.00 0.00 ' 00:11:08.017 22:18:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:11:08.017 22:18:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:08.017 22:18:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:11:08.017 22:18:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=81920.00 00:11:08.017 22:18:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 81920 00:11:08.017 22:18:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=81920 00:11:08.017 22:18:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=8 00:11:08.017 22:18:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 8 -lt 2 ']' 00:11:08.017 22:18:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:11:08.017 22:18:18 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:08.018 22:18:18 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:08.018 22:18:18 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:08.018 22:18:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:11:08.018 22:18:18 blockdev_general.bdev_qos -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:08.018 22:18:18 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:08.018 22:18:18 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:08.018 ************************************ 00:11:08.018 START TEST bdev_qos_bw 00:11:08.018 ************************************ 00:11:08.018 22:18:18 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 8 BANDWIDTH Null_1 00:11:08.018 22:18:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=8 00:11:08.018 22:18:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:11:08.018 22:18:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:11:08.018 22:18:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:11:08.018 22:18:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:11:08.018 22:18:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:08.018 22:18:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:08.018 22:18:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:11:08.018 22:18:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 2047.64 8190.57 0.00 0.00 8452.00 0.00 0.00 ' 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=8452.00 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 8452 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=8452 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=8192 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=7372 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=9011 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8452 -lt 7372 ']' 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8452 -gt 9011 ']' 00:11:13.375 00:11:13.375 real 0m5.310s 00:11:13.375 user 0m0.120s 00:11:13.375 sys 0m0.040s 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:11:13.375 ************************************ 00:11:13.375 END TEST bdev_qos_bw 00:11:13.375 ************************************ 00:11:13.375 22:18:23 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # 
return 0 00:11:13.375 22:18:23 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:11:13.375 22:18:23 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:13.375 22:18:23 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:13.375 22:18:23 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:13.375 22:18:23 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:11:13.375 22:18:23 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:13.375 22:18:23 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:13.375 22:18:23 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:13.375 ************************************ 00:11:13.375 START TEST bdev_qos_ro_bw 00:11:13.375 ************************************ 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:11:13.375 22:18:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.41 2045.65 0.00 0.00 2052.00 0.00 0.00 ' 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2052.00 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2052 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2052 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw 
-- bdev/blockdev.sh@400 -- # '[' 2052 -lt 1843 ']' 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2052 -gt 2252 ']' 00:11:18.649 00:11:18.649 real 0m5.179s 00:11:18.649 user 0m0.115s 00:11:18.649 sys 0m0.043s 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:18.649 22:18:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:11:18.649 ************************************ 00:11:18.649 END TEST bdev_qos_ro_bw 00:11:18.649 ************************************ 00:11:18.649 22:18:28 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:11:18.649 22:18:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:11:18.649 22:18:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:18.649 22:18:28 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:19.218 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:19.218 22:18:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:11:19.218 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:19.218 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:19.218 00:11:19.218 Latency(us) 00:11:19.218 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:19.218 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:19.218 Malloc_0 : 26.82 20765.10 81.11 0.00 0.00 12214.79 2037.31 503316.48 00:11:19.218 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:19.218 Null_1 : 26.97 20401.29 79.69 0.00 0.00 12518.51 808.51 151359.67 00:11:19.218 =================================================================================================================== 00:11:19.218 Total : 41166.40 160.81 0.00 0.00 12365.73 808.51 503316.48 00:11:19.218 0 00:11:19.218 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:19.218 22:18:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 3412709 00:11:19.218 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 3412709 ']' 00:11:19.218 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 3412709 00:11:19.218 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:11:19.218 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:19.218 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3412709 00:11:19.478 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:11:19.478 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:11:19.478 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3412709' 00:11:19.478 killing process with pid 3412709 00:11:19.478 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 3412709 00:11:19.478 Received shutdown signal, test time was about 27.036855 seconds 00:11:19.478 00:11:19.478 Latency(us) 00:11:19.478 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:19.478 
=================================================================================================================== 00:11:19.478 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:19.478 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 3412709 00:11:19.738 22:18:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:11:19.738 00:11:19.738 real 0m28.072s 00:11:19.738 user 0m28.861s 00:11:19.738 sys 0m0.814s 00:11:19.738 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:19.738 22:18:29 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:19.738 ************************************ 00:11:19.738 END TEST bdev_qos 00:11:19.738 ************************************ 00:11:19.738 22:18:29 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:19.738 22:18:29 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:11:19.738 22:18:29 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:19.738 22:18:29 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:19.738 22:18:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:19.738 ************************************ 00:11:19.738 START TEST bdev_qd_sampling 00:11:19.738 ************************************ 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=3416884 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 3416884' 00:11:19.738 Process bdev QD sampling period testing pid: 3416884 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 3416884 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 3416884 ']' 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:19.738 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:19.738 22:18:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:19.738 [2024-07-12 22:18:29.961916] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
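[Editor's sketch] The bdev_qos_ro_bw check that completed just above reduces to one RPC plus an iostat sample. The lines below are not part of the test log; they are a hedged, standalone reconstruction that assumes an SPDK application is already running on the default /var/tmp/spdk.sock, that a Malloc_0 bdev already exists (the suite creates it earlier in this run), and that ./scripts/rpc.py and ./scripts/iostat.py are the stock script locations in an SPDK checkout. The 2 MiB/s limit and the 1843..2252 bounds mirror the run above.

    # Cap read bandwidth on Malloc_0 at 2 MiB/s, as blockdev.sh@436 does above.
    ./scripts/rpc.py bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0

    # Sample the achieved read rate over five 1-second intervals and keep the last line.
    iostat_result=$(./scripts/iostat.py -d -i 1 -t 5 | grep Malloc_0 | tail -1)
    qos_result=$(echo "$iostat_result" | awk '{print $6}' | cut -d. -f1)   # column 6 is what the suite compares against the limit

    # The suite accepts anything within roughly 10% of the 2048 KB/s limit (1843..2252).
    [ "$qos_result" -ge 1843 ] && [ "$qos_result" -le 2252 ] && echo "read bandwidth QoS honored"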
00:11:19.738 [2024-07-12 22:18:29.962003] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3416884 ] 00:11:19.997 [2024-07-12 22:18:30.093954] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:19.997 [2024-07-12 22:18:30.193945] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:19.997 [2024-07-12 22:18:30.193953] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:20.932 Malloc_QD 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:20.932 [ 00:11:20.932 { 00:11:20.932 "name": "Malloc_QD", 00:11:20.932 "aliases": [ 00:11:20.932 "49c89e69-1fe6-4dfc-a3b2-22446f66f96f" 00:11:20.932 ], 00:11:20.932 "product_name": "Malloc disk", 00:11:20.932 "block_size": 512, 00:11:20.932 "num_blocks": 262144, 00:11:20.932 "uuid": "49c89e69-1fe6-4dfc-a3b2-22446f66f96f", 00:11:20.932 "assigned_rate_limits": { 00:11:20.932 "rw_ios_per_sec": 0, 00:11:20.932 "rw_mbytes_per_sec": 0, 00:11:20.932 "r_mbytes_per_sec": 0, 00:11:20.932 "w_mbytes_per_sec": 0 00:11:20.932 }, 00:11:20.932 "claimed": false, 00:11:20.932 "zoned": false, 00:11:20.932 "supported_io_types": { 00:11:20.932 "read": true, 00:11:20.932 "write": true, 00:11:20.932 "unmap": true, 00:11:20.932 "flush": true, 00:11:20.932 "reset": true, 00:11:20.932 "nvme_admin": false, 00:11:20.932 
"nvme_io": false, 00:11:20.932 "nvme_io_md": false, 00:11:20.932 "write_zeroes": true, 00:11:20.932 "zcopy": true, 00:11:20.932 "get_zone_info": false, 00:11:20.932 "zone_management": false, 00:11:20.932 "zone_append": false, 00:11:20.932 "compare": false, 00:11:20.932 "compare_and_write": false, 00:11:20.932 "abort": true, 00:11:20.932 "seek_hole": false, 00:11:20.932 "seek_data": false, 00:11:20.932 "copy": true, 00:11:20.932 "nvme_iov_md": false 00:11:20.932 }, 00:11:20.932 "memory_domains": [ 00:11:20.932 { 00:11:20.932 "dma_device_id": "system", 00:11:20.932 "dma_device_type": 1 00:11:20.932 }, 00:11:20.932 { 00:11:20.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.932 "dma_device_type": 2 00:11:20.932 } 00:11:20.932 ], 00:11:20.932 "driver_specific": {} 00:11:20.932 } 00:11:20.932 ] 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:11:20.932 22:18:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:20.932 Running I/O for 5 seconds... 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:11:22.835 "tick_rate": 2300000000, 00:11:22.835 "ticks": 4837838338583312, 00:11:22.835 "bdevs": [ 00:11:22.835 { 00:11:22.835 "name": "Malloc_QD", 00:11:22.835 "bytes_read": 764457472, 00:11:22.835 "num_read_ops": 186628, 00:11:22.835 "bytes_written": 0, 00:11:22.835 "num_write_ops": 0, 00:11:22.835 "bytes_unmapped": 0, 00:11:22.835 "num_unmap_ops": 0, 00:11:22.835 "bytes_copied": 0, 00:11:22.835 "num_copy_ops": 0, 00:11:22.835 "read_latency_ticks": 2232218109360, 00:11:22.835 "max_read_latency_ticks": 14757454, 00:11:22.835 "min_read_latency_ticks": 256556, 00:11:22.835 "write_latency_ticks": 0, 00:11:22.835 "max_write_latency_ticks": 0, 00:11:22.835 "min_write_latency_ticks": 0, 00:11:22.835 "unmap_latency_ticks": 0, 00:11:22.835 "max_unmap_latency_ticks": 0, 00:11:22.835 
"min_unmap_latency_ticks": 0, 00:11:22.835 "copy_latency_ticks": 0, 00:11:22.835 "max_copy_latency_ticks": 0, 00:11:22.835 "min_copy_latency_ticks": 0, 00:11:22.835 "io_error": {}, 00:11:22.835 "queue_depth_polling_period": 10, 00:11:22.835 "queue_depth": 512, 00:11:22.835 "io_time": 20, 00:11:22.835 "weighted_io_time": 10240 00:11:22.835 } 00:11:22.835 ] 00:11:22.835 }' 00:11:22.835 22:18:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:11:22.835 22:18:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:22.836 00:11:22.836 Latency(us) 00:11:22.836 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:22.836 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:11:22.836 Malloc_QD : 1.97 48955.38 191.23 0.00 0.00 5216.33 1446.07 5584.81 00:11:22.836 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:22.836 Malloc_QD : 1.97 49414.14 193.02 0.00 0.00 5168.41 1004.41 6439.62 00:11:22.836 =================================================================================================================== 00:11:22.836 Total : 98369.52 384.26 0.00 0.00 5192.24 1004.41 6439.62 00:11:22.836 0 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 3416884 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 3416884 ']' 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 3416884 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3416884 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3416884' 00:11:22.836 killing process with pid 3416884 00:11:22.836 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 3416884 00:11:22.836 Received shutdown signal, test time was about 2.054720 seconds 00:11:22.836 00:11:22.836 Latency(us) 00:11:22.836 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:22.836 =================================================================================================================== 00:11:22.836 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:22.836 22:18:33 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 3416884 00:11:23.094 22:18:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:11:23.094 00:11:23.094 real 0m3.475s 00:11:23.094 user 0m6.801s 00:11:23.094 sys 0m0.430s 00:11:23.094 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:23.094 22:18:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:23.094 ************************************ 00:11:23.094 END TEST bdev_qd_sampling 00:11:23.094 ************************************ 00:11:23.094 22:18:33 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:23.094 22:18:33 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:11:23.094 22:18:33 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:23.094 22:18:33 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:23.094 22:18:33 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:23.353 ************************************ 00:11:23.353 START TEST bdev_error 00:11:23.353 ************************************ 00:11:23.353 22:18:33 blockdev_general.bdev_error -- common/autotest_common.sh@1123 -- # error_test_suite '' 00:11:23.353 22:18:33 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:11:23.353 22:18:33 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:11:23.353 22:18:33 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:11:23.353 22:18:33 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=3417441 00:11:23.353 22:18:33 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 3417441' 00:11:23.353 Process error testing pid: 3417441 00:11:23.353 22:18:33 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:11:23.353 22:18:33 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 3417441 00:11:23.353 22:18:33 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 3417441 ']' 00:11:23.353 22:18:33 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:23.353 22:18:33 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:23.353 22:18:33 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:23.353 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:23.353 22:18:33 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:23.353 22:18:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:23.353 [2024-07-12 22:18:33.525272] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
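[Editor's sketch] The queue-depth sampling pass that just finished can likewise be replayed by hand. This is a minimal reconstruction, not log output; it uses only the RPCs and jq filters visible in the trace, takes Malloc_QD and the period value 10 from the run, and assumes a bdevperf or other SPDK application is already serving the bdev on the default socket.

    # Enable queue-depth sampling on Malloc_QD with the same period blockdev.sh@523 uses.
    ./scripts/rpc.py bdev_set_qd_sampling_period Malloc_QD 10

    # After some I/O has run, read the stats back and confirm the period was applied.
    iostats=$(./scripts/rpc.py bdev_get_iostat -b Malloc_QD)
    period=$(echo "$iostats" | jq -r '.bdevs[0].queue_depth_polling_period')
    [ "$period" -eq 10 ] && echo "sampling period active"

    # io_time and weighted_io_time only accumulate while sampling is enabled.
    echo "$iostats" | jq -r '.bdevs[0].io_time, .bdevs[0].weighted_io_time'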
00:11:23.353 [2024-07-12 22:18:33.525338] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3417441 ] 00:11:23.353 [2024-07-12 22:18:33.644240] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:23.611 [2024-07-12 22:18:33.746324] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:11:24.179 22:18:34 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.179 Dev_1 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.179 22:18:34 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.179 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.438 [ 00:11:24.438 { 00:11:24.438 "name": "Dev_1", 00:11:24.438 "aliases": [ 00:11:24.438 "eb1426fd-c529-4ec6-a339-dffa7b6c273d" 00:11:24.438 ], 00:11:24.438 "product_name": "Malloc disk", 00:11:24.438 "block_size": 512, 00:11:24.438 "num_blocks": 262144, 00:11:24.438 "uuid": "eb1426fd-c529-4ec6-a339-dffa7b6c273d", 00:11:24.438 "assigned_rate_limits": { 00:11:24.438 "rw_ios_per_sec": 0, 00:11:24.438 "rw_mbytes_per_sec": 0, 00:11:24.438 "r_mbytes_per_sec": 0, 00:11:24.438 "w_mbytes_per_sec": 0 00:11:24.438 }, 00:11:24.438 "claimed": false, 00:11:24.438 "zoned": false, 00:11:24.438 "supported_io_types": { 00:11:24.438 "read": true, 00:11:24.438 "write": true, 00:11:24.438 "unmap": true, 00:11:24.438 "flush": true, 00:11:24.438 "reset": true, 00:11:24.438 "nvme_admin": false, 00:11:24.438 "nvme_io": false, 00:11:24.438 "nvme_io_md": false, 00:11:24.438 "write_zeroes": true, 00:11:24.438 "zcopy": true, 00:11:24.438 "get_zone_info": false, 00:11:24.438 "zone_management": false, 00:11:24.438 "zone_append": false, 00:11:24.438 
"compare": false, 00:11:24.438 "compare_and_write": false, 00:11:24.438 "abort": true, 00:11:24.438 "seek_hole": false, 00:11:24.438 "seek_data": false, 00:11:24.438 "copy": true, 00:11:24.438 "nvme_iov_md": false 00:11:24.438 }, 00:11:24.438 "memory_domains": [ 00:11:24.438 { 00:11:24.438 "dma_device_id": "system", 00:11:24.438 "dma_device_type": 1 00:11:24.438 }, 00:11:24.438 { 00:11:24.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.438 "dma_device_type": 2 00:11:24.438 } 00:11:24.438 ], 00:11:24.438 "driver_specific": {} 00:11:24.438 } 00:11:24.438 ] 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:24.438 22:18:34 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.438 true 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.438 22:18:34 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.438 Dev_2 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.438 22:18:34 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.438 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.438 [ 00:11:24.438 { 00:11:24.438 "name": "Dev_2", 00:11:24.438 "aliases": [ 00:11:24.439 "e1fd352a-2489-41d3-85ca-1e3ff971f6ed" 00:11:24.439 ], 00:11:24.439 "product_name": "Malloc disk", 00:11:24.439 "block_size": 512, 00:11:24.439 "num_blocks": 262144, 00:11:24.439 "uuid": "e1fd352a-2489-41d3-85ca-1e3ff971f6ed", 00:11:24.439 "assigned_rate_limits": { 00:11:24.439 "rw_ios_per_sec": 0, 00:11:24.439 "rw_mbytes_per_sec": 0, 00:11:24.439 "r_mbytes_per_sec": 0, 00:11:24.439 "w_mbytes_per_sec": 0 00:11:24.439 }, 00:11:24.439 "claimed": false, 
00:11:24.439 "zoned": false, 00:11:24.439 "supported_io_types": { 00:11:24.439 "read": true, 00:11:24.439 "write": true, 00:11:24.439 "unmap": true, 00:11:24.439 "flush": true, 00:11:24.439 "reset": true, 00:11:24.439 "nvme_admin": false, 00:11:24.439 "nvme_io": false, 00:11:24.439 "nvme_io_md": false, 00:11:24.439 "write_zeroes": true, 00:11:24.439 "zcopy": true, 00:11:24.439 "get_zone_info": false, 00:11:24.439 "zone_management": false, 00:11:24.439 "zone_append": false, 00:11:24.439 "compare": false, 00:11:24.439 "compare_and_write": false, 00:11:24.439 "abort": true, 00:11:24.439 "seek_hole": false, 00:11:24.439 "seek_data": false, 00:11:24.439 "copy": true, 00:11:24.439 "nvme_iov_md": false 00:11:24.439 }, 00:11:24.439 "memory_domains": [ 00:11:24.439 { 00:11:24.439 "dma_device_id": "system", 00:11:24.439 "dma_device_type": 1 00:11:24.439 }, 00:11:24.439 { 00:11:24.439 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.439 "dma_device_type": 2 00:11:24.439 } 00:11:24.439 ], 00:11:24.439 "driver_specific": {} 00:11:24.439 } 00:11:24.439 ] 00:11:24.439 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.439 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:24.439 22:18:34 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:11:24.439 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.439 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.439 22:18:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.439 22:18:34 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:11:24.439 22:18:34 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:11:24.439 Running I/O for 5 seconds... 00:11:25.375 22:18:35 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 3417441 00:11:25.375 22:18:35 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 3417441' 00:11:25.375 Process is existed as continue on error is set. 
Pid: 3417441 00:11:25.375 22:18:35 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:11:25.375 22:18:35 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:25.375 22:18:35 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:25.375 22:18:35 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:25.375 22:18:35 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:11:25.375 22:18:35 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:25.375 22:18:35 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:25.375 22:18:35 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:25.375 22:18:35 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:11:25.635 Timeout while waiting for response: 00:11:25.635 00:11:25.635 00:11:29.822 00:11:29.822 Latency(us) 00:11:29.822 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:29.822 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:29.822 EE_Dev_1 : 0.90 37227.85 145.42 5.57 0.00 426.19 133.57 701.66 00:11:29.822 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:29.822 Dev_2 : 5.00 80452.04 314.27 0.00 0.00 195.37 74.80 22795.13 00:11:29.822 =================================================================================================================== 00:11:29.822 Total : 117679.89 459.69 5.57 0.00 213.08 74.80 22795.13 00:11:30.389 22:18:40 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 3417441 00:11:30.389 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 3417441 ']' 00:11:30.389 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 3417441 00:11:30.389 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:11:30.389 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:30.389 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3417441 00:11:30.648 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:11:30.648 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:11:30.648 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3417441' 00:11:30.648 killing process with pid 3417441 00:11:30.648 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 3417441 00:11:30.648 Received shutdown signal, test time was about 5.000000 seconds 00:11:30.648 00:11:30.648 Latency(us) 00:11:30.648 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:30.648 =================================================================================================================== 00:11:30.648 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:30.648 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 3417441 00:11:30.906 22:18:40 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=3418332 00:11:30.906 22:18:40 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 3418332' 00:11:30.906 22:18:40 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:11:30.907 Process error testing pid: 3418332 00:11:30.907 22:18:40 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 3418332 00:11:30.907 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 3418332 ']' 00:11:30.907 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:30.907 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:30.907 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:30.907 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:30.907 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:30.907 22:18:40 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:30.907 [2024-07-12 22:18:41.037947] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:11:30.907 [2024-07-12 22:18:41.038017] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3418332 ] 00:11:30.907 [2024-07-12 22:18:41.156085] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:31.165 [2024-07-12 22:18:41.261138] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:31.747 22:18:41 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:31.747 22:18:41 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:11:31.747 22:18:41 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:11:31.747 22:18:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:31.747 22:18:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:31.747 Dev_1 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:31.747 22:18:42 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:11:31.747 22:18:42 
blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:31.747 [ 00:11:31.747 { 00:11:31.747 "name": "Dev_1", 00:11:31.747 "aliases": [ 00:11:31.747 "a3dc29b1-accf-483d-b6f7-8f19e6cda854" 00:11:31.747 ], 00:11:31.747 "product_name": "Malloc disk", 00:11:31.747 "block_size": 512, 00:11:31.747 "num_blocks": 262144, 00:11:31.747 "uuid": "a3dc29b1-accf-483d-b6f7-8f19e6cda854", 00:11:31.747 "assigned_rate_limits": { 00:11:31.747 "rw_ios_per_sec": 0, 00:11:31.747 "rw_mbytes_per_sec": 0, 00:11:31.747 "r_mbytes_per_sec": 0, 00:11:31.747 "w_mbytes_per_sec": 0 00:11:31.747 }, 00:11:31.747 "claimed": false, 00:11:31.747 "zoned": false, 00:11:31.747 "supported_io_types": { 00:11:31.747 "read": true, 00:11:31.747 "write": true, 00:11:31.747 "unmap": true, 00:11:31.747 "flush": true, 00:11:31.747 "reset": true, 00:11:31.747 "nvme_admin": false, 00:11:31.747 "nvme_io": false, 00:11:31.747 "nvme_io_md": false, 00:11:31.747 "write_zeroes": true, 00:11:31.747 "zcopy": true, 00:11:31.747 "get_zone_info": false, 00:11:31.747 "zone_management": false, 00:11:31.747 "zone_append": false, 00:11:31.747 "compare": false, 00:11:31.747 "compare_and_write": false, 00:11:31.747 "abort": true, 00:11:31.747 "seek_hole": false, 00:11:31.747 "seek_data": false, 00:11:31.747 "copy": true, 00:11:31.747 "nvme_iov_md": false 00:11:31.747 }, 00:11:31.747 "memory_domains": [ 00:11:31.747 { 00:11:31.747 "dma_device_id": "system", 00:11:31.747 "dma_device_type": 1 00:11:31.747 }, 00:11:31.747 { 00:11:31.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:31.747 "dma_device_type": 2 00:11:31.747 } 00:11:31.747 ], 00:11:31.747 "driver_specific": {} 00:11:31.747 } 00:11:31.747 ] 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:31.747 22:18:42 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:31.747 true 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:31.747 22:18:42 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:31.747 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:32.009 Dev_2 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:32.009 22:18:42 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@902 
-- # rpc_cmd bdev_wait_for_examine 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:32.009 [ 00:11:32.009 { 00:11:32.009 "name": "Dev_2", 00:11:32.009 "aliases": [ 00:11:32.009 "d9d19b58-86c7-4062-ad51-3273abf0658a" 00:11:32.009 ], 00:11:32.009 "product_name": "Malloc disk", 00:11:32.009 "block_size": 512, 00:11:32.009 "num_blocks": 262144, 00:11:32.009 "uuid": "d9d19b58-86c7-4062-ad51-3273abf0658a", 00:11:32.009 "assigned_rate_limits": { 00:11:32.009 "rw_ios_per_sec": 0, 00:11:32.009 "rw_mbytes_per_sec": 0, 00:11:32.009 "r_mbytes_per_sec": 0, 00:11:32.009 "w_mbytes_per_sec": 0 00:11:32.009 }, 00:11:32.009 "claimed": false, 00:11:32.009 "zoned": false, 00:11:32.009 "supported_io_types": { 00:11:32.009 "read": true, 00:11:32.009 "write": true, 00:11:32.009 "unmap": true, 00:11:32.009 "flush": true, 00:11:32.009 "reset": true, 00:11:32.009 "nvme_admin": false, 00:11:32.009 "nvme_io": false, 00:11:32.009 "nvme_io_md": false, 00:11:32.009 "write_zeroes": true, 00:11:32.009 "zcopy": true, 00:11:32.009 "get_zone_info": false, 00:11:32.009 "zone_management": false, 00:11:32.009 "zone_append": false, 00:11:32.009 "compare": false, 00:11:32.009 "compare_and_write": false, 00:11:32.009 "abort": true, 00:11:32.009 "seek_hole": false, 00:11:32.009 "seek_data": false, 00:11:32.009 "copy": true, 00:11:32.009 "nvme_iov_md": false 00:11:32.009 }, 00:11:32.009 "memory_domains": [ 00:11:32.009 { 00:11:32.009 "dma_device_id": "system", 00:11:32.009 "dma_device_type": 1 00:11:32.009 }, 00:11:32.009 { 00:11:32.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.009 "dma_device_type": 2 00:11:32.009 } 00:11:32.009 ], 00:11:32.009 "driver_specific": {} 00:11:32.009 } 00:11:32.009 ] 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:32.009 22:18:42 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:32.009 22:18:42 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 3418332 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 3418332 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:11:32.009 22:18:42 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:11:32.009 22:18:42 blockdev_general.bdev_error -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:32.009 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 3418332 00:11:32.009 Running I/O for 5 seconds... 00:11:32.009 task offset: 197696 on job bdev=EE_Dev_1 fails 00:11:32.009 00:11:32.009 Latency(us) 00:11:32.009 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:32.009 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:32.009 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:11:32.009 EE_Dev_1 : 0.00 29530.20 115.35 6711.41 0.00 359.98 132.67 641.11 00:11:32.009 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:32.009 Dev_2 : 0.00 18212.86 71.14 0.00 0.00 652.72 126.44 1210.99 00:11:32.009 =================================================================================================================== 00:11:32.010 Total : 47743.06 186.50 6711.41 0.00 518.75 126.44 1210.99 00:11:32.010 [2024-07-12 22:18:42.249957] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:11:32.010 request: 00:11:32.010 { 00:11:32.010 "method": "perform_tests", 00:11:32.010 "req_id": 1 00:11:32.010 } 00:11:32.010 Got JSON-RPC error response 00:11:32.010 response: 00:11:32.010 { 00:11:32.010 "code": -32603, 00:11:32.010 "message": "bdevperf failed with error Operation not permitted" 00:11:32.010 } 00:11:32.268 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:11:32.268 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:32.268 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:11:32.268 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:11:32.268 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:11:32.268 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:32.268 00:11:32.268 real 0m9.056s 00:11:32.268 user 0m9.489s 00:11:32.268 sys 0m0.858s 00:11:32.268 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:32.268 22:18:42 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:32.268 ************************************ 00:11:32.268 END TEST bdev_error 00:11:32.268 ************************************ 00:11:32.268 22:18:42 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:32.268 22:18:42 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:11:32.268 22:18:42 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:32.268 22:18:42 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:32.268 22:18:42 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:32.527 ************************************ 00:11:32.527 START TEST bdev_stat 00:11:32.527 ************************************ 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=3418653 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 3418653' 00:11:32.527 Process Bdev IO statistics testing pid: 3418653 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 3418653 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 3418653 ']' 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:32.527 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:32.527 22:18:42 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:32.527 [2024-07-12 22:18:42.641194] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:11:32.527 [2024-07-12 22:18:42.641243] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3418653 ] 00:11:32.527 [2024-07-12 22:18:42.752920] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:32.785 [2024-07-12 22:18:42.861305] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:32.785 [2024-07-12 22:18:42.861311] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:33.352 Malloc_STAT 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # 
rpc_cmd bdev_wait_for_examine 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:33.352 [ 00:11:33.352 { 00:11:33.352 "name": "Malloc_STAT", 00:11:33.352 "aliases": [ 00:11:33.352 "b1b5e1f8-a671-46b8-97a9-71df0a3e6430" 00:11:33.352 ], 00:11:33.352 "product_name": "Malloc disk", 00:11:33.352 "block_size": 512, 00:11:33.352 "num_blocks": 262144, 00:11:33.352 "uuid": "b1b5e1f8-a671-46b8-97a9-71df0a3e6430", 00:11:33.352 "assigned_rate_limits": { 00:11:33.352 "rw_ios_per_sec": 0, 00:11:33.352 "rw_mbytes_per_sec": 0, 00:11:33.352 "r_mbytes_per_sec": 0, 00:11:33.352 "w_mbytes_per_sec": 0 00:11:33.352 }, 00:11:33.352 "claimed": false, 00:11:33.352 "zoned": false, 00:11:33.352 "supported_io_types": { 00:11:33.352 "read": true, 00:11:33.352 "write": true, 00:11:33.352 "unmap": true, 00:11:33.352 "flush": true, 00:11:33.352 "reset": true, 00:11:33.352 "nvme_admin": false, 00:11:33.352 "nvme_io": false, 00:11:33.352 "nvme_io_md": false, 00:11:33.352 "write_zeroes": true, 00:11:33.352 "zcopy": true, 00:11:33.352 "get_zone_info": false, 00:11:33.352 "zone_management": false, 00:11:33.352 "zone_append": false, 00:11:33.352 "compare": false, 00:11:33.352 "compare_and_write": false, 00:11:33.352 "abort": true, 00:11:33.352 "seek_hole": false, 00:11:33.352 "seek_data": false, 00:11:33.352 "copy": true, 00:11:33.352 "nvme_iov_md": false 00:11:33.352 }, 00:11:33.352 "memory_domains": [ 00:11:33.352 { 00:11:33.352 "dma_device_id": "system", 00:11:33.352 "dma_device_type": 1 00:11:33.352 }, 00:11:33.352 { 00:11:33.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.352 "dma_device_type": 2 00:11:33.352 } 00:11:33.352 ], 00:11:33.352 "driver_specific": {} 00:11:33.352 } 00:11:33.352 ] 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.352 22:18:43 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:11:33.609 22:18:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:11:33.609 22:18:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:33.609 Running I/O for 10 seconds... 
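[Editor's sketch] For completeness, the bdev_error suite that ended a little further up follows a simple pattern: a Malloc bdev, an error-injection bdev stacked on it, a burst of forced failures, then teardown. The lines below are a rough reconstruction using only RPCs shown in the trace; the Dev_1/EE_Dev_1 names and 128x512 sizing mirror the run, and the I/O load itself came from bdevperf with -q 16 -w randread.

    # Create the base bdev and layer the error-injection bdev EE_Dev_1 on top of it.
    ./scripts/rpc.py bdev_malloc_create -b Dev_1 128 512
    ./scripts/rpc.py bdev_error_create Dev_1

    # Force the next 5 I/Os of any type to fail, as blockdev.sh@481 and @512 do above.
    ./scripts/rpc.py bdev_error_inject_error EE_Dev_1 all failure -n 5

    # ... drive I/O against EE_Dev_1 here; it reports the injected failures ...

    # Tear the stack down in reverse order.
    ./scripts/rpc.py bdev_error_delete EE_Dev_1
    ./scripts/rpc.py bdev_malloc_delete Dev_1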
00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:11:35.509 "tick_rate": 2300000000, 00:11:35.509 "ticks": 4837867491121656, 00:11:35.509 "bdevs": [ 00:11:35.509 { 00:11:35.509 "name": "Malloc_STAT", 00:11:35.509 "bytes_read": 758166016, 00:11:35.509 "num_read_ops": 185092, 00:11:35.509 "bytes_written": 0, 00:11:35.509 "num_write_ops": 0, 00:11:35.509 "bytes_unmapped": 0, 00:11:35.509 "num_unmap_ops": 0, 00:11:35.509 "bytes_copied": 0, 00:11:35.509 "num_copy_ops": 0, 00:11:35.509 "read_latency_ticks": 2231292450226, 00:11:35.509 "max_read_latency_ticks": 14569198, 00:11:35.509 "min_read_latency_ticks": 270274, 00:11:35.509 "write_latency_ticks": 0, 00:11:35.509 "max_write_latency_ticks": 0, 00:11:35.509 "min_write_latency_ticks": 0, 00:11:35.509 "unmap_latency_ticks": 0, 00:11:35.509 "max_unmap_latency_ticks": 0, 00:11:35.509 "min_unmap_latency_ticks": 0, 00:11:35.509 "copy_latency_ticks": 0, 00:11:35.509 "max_copy_latency_ticks": 0, 00:11:35.509 "min_copy_latency_ticks": 0, 00:11:35.509 "io_error": {} 00:11:35.509 } 00:11:35.509 ] 00:11:35.509 }' 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=185092 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.509 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:11:35.509 "tick_rate": 2300000000, 00:11:35.509 "ticks": 4837867650347556, 00:11:35.509 "name": "Malloc_STAT", 00:11:35.509 "channels": [ 00:11:35.509 { 00:11:35.509 "thread_id": 2, 00:11:35.509 "bytes_read": 389021696, 00:11:35.509 "num_read_ops": 94976, 00:11:35.509 "bytes_written": 0, 00:11:35.509 "num_write_ops": 0, 00:11:35.509 "bytes_unmapped": 0, 00:11:35.509 "num_unmap_ops": 0, 
00:11:35.509 "bytes_copied": 0, 00:11:35.509 "num_copy_ops": 0, 00:11:35.509 "read_latency_ticks": 1155921607792, 00:11:35.509 "max_read_latency_ticks": 12904488, 00:11:35.509 "min_read_latency_ticks": 8187226, 00:11:35.509 "write_latency_ticks": 0, 00:11:35.509 "max_write_latency_ticks": 0, 00:11:35.509 "min_write_latency_ticks": 0, 00:11:35.509 "unmap_latency_ticks": 0, 00:11:35.509 "max_unmap_latency_ticks": 0, 00:11:35.509 "min_unmap_latency_ticks": 0, 00:11:35.509 "copy_latency_ticks": 0, 00:11:35.509 "max_copy_latency_ticks": 0, 00:11:35.509 "min_copy_latency_ticks": 0 00:11:35.509 }, 00:11:35.509 { 00:11:35.509 "thread_id": 3, 00:11:35.509 "bytes_read": 397410304, 00:11:35.510 "num_read_ops": 97024, 00:11:35.510 "bytes_written": 0, 00:11:35.510 "num_write_ops": 0, 00:11:35.510 "bytes_unmapped": 0, 00:11:35.510 "num_unmap_ops": 0, 00:11:35.510 "bytes_copied": 0, 00:11:35.510 "num_copy_ops": 0, 00:11:35.510 "read_latency_ticks": 1158694519814, 00:11:35.510 "max_read_latency_ticks": 14569198, 00:11:35.510 "min_read_latency_ticks": 8082868, 00:11:35.510 "write_latency_ticks": 0, 00:11:35.510 "max_write_latency_ticks": 0, 00:11:35.510 "min_write_latency_ticks": 0, 00:11:35.510 "unmap_latency_ticks": 0, 00:11:35.510 "max_unmap_latency_ticks": 0, 00:11:35.510 "min_unmap_latency_ticks": 0, 00:11:35.510 "copy_latency_ticks": 0, 00:11:35.510 "max_copy_latency_ticks": 0, 00:11:35.510 "min_copy_latency_ticks": 0 00:11:35.510 } 00:11:35.510 ] 00:11:35.510 }' 00:11:35.510 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:11:35.510 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=94976 00:11:35.510 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=94976 00:11:35.510 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=97024 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=192000 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:11:35.779 "tick_rate": 2300000000, 00:11:35.779 "ticks": 4837867919755380, 00:11:35.779 "bdevs": [ 00:11:35.779 { 00:11:35.779 "name": "Malloc_STAT", 00:11:35.779 "bytes_read": 832614912, 00:11:35.779 "num_read_ops": 203268, 00:11:35.779 "bytes_written": 0, 00:11:35.779 "num_write_ops": 0, 00:11:35.779 "bytes_unmapped": 0, 00:11:35.779 "num_unmap_ops": 0, 00:11:35.779 "bytes_copied": 0, 00:11:35.779 "num_copy_ops": 0, 00:11:35.779 "read_latency_ticks": 2450751567920, 00:11:35.779 "max_read_latency_ticks": 14569198, 00:11:35.779 "min_read_latency_ticks": 270274, 00:11:35.779 "write_latency_ticks": 0, 00:11:35.779 "max_write_latency_ticks": 0, 00:11:35.779 "min_write_latency_ticks": 0, 00:11:35.779 "unmap_latency_ticks": 0, 00:11:35.779 "max_unmap_latency_ticks": 0, 00:11:35.779 "min_unmap_latency_ticks": 0, 00:11:35.779 "copy_latency_ticks": 0, 00:11:35.779 "max_copy_latency_ticks": 0, 00:11:35.779 
"min_copy_latency_ticks": 0, 00:11:35.779 "io_error": {} 00:11:35.779 } 00:11:35.779 ] 00:11:35.779 }' 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=203268 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 192000 -lt 185092 ']' 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 192000 -gt 203268 ']' 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:35.779 00:11:35.779 Latency(us) 00:11:35.779 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:35.779 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:11:35.779 Malloc_STAT : 2.16 48278.49 188.59 0.00 0.00 5289.86 1467.44 5613.30 00:11:35.779 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:35.779 Malloc_STAT : 2.17 49283.00 192.51 0.00 0.00 5182.66 997.29 6354.14 00:11:35.779 =================================================================================================================== 00:11:35.779 Total : 97561.49 381.10 0.00 0.00 5235.67 997.29 6354.14 00:11:35.779 0 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 3418653 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 3418653 ']' 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 3418653 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:35.779 22:18:45 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3418653 00:11:35.779 22:18:46 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:35.779 22:18:46 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:35.779 22:18:46 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3418653' 00:11:35.779 killing process with pid 3418653 00:11:35.779 22:18:46 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 3418653 00:11:35.779 Received shutdown signal, test time was about 2.245534 seconds 00:11:35.779 00:11:35.780 Latency(us) 00:11:35.780 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:35.780 =================================================================================================================== 00:11:35.780 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:35.780 22:18:46 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 3418653 00:11:36.070 22:18:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:11:36.070 00:11:36.070 real 0m3.659s 00:11:36.070 user 0m7.389s 00:11:36.070 sys 0m0.450s 00:11:36.070 22:18:46 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:36.070 22:18:46 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:36.070 ************************************ 00:11:36.070 END TEST bdev_stat 00:11:36.070 ************************************ 00:11:36.070 22:18:46 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:36.070 22:18:46 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:11:36.070 22:18:46 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:11:36.070 22:18:46 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:11:36.070 22:18:46 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:11:36.070 22:18:46 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:11:36.070 22:18:46 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:11:36.070 22:18:46 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:11:36.070 22:18:46 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:11:36.070 22:18:46 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:11:36.070 22:18:46 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:11:36.070 00:11:36.070 real 1m55.960s 00:11:36.070 user 7m9.963s 00:11:36.071 sys 0m23.027s 00:11:36.071 22:18:46 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:36.071 22:18:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:36.071 ************************************ 00:11:36.071 END TEST blockdev_general 00:11:36.071 ************************************ 00:11:36.071 22:18:46 -- common/autotest_common.sh@1142 -- # return 0 00:11:36.071 22:18:46 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:11:36.071 22:18:46 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:36.071 22:18:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:36.071 22:18:46 -- common/autotest_common.sh@10 -- # set +x 00:11:36.339 ************************************ 00:11:36.339 START TEST bdev_raid 00:11:36.339 ************************************ 00:11:36.339 22:18:46 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:11:36.339 * Looking for test storage... 
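Before the raid tests begin: the bdev_stat run that just finished above works by taking an aggregate iostat snapshot, then a per-channel snapshot, then a second aggregate snapshot, and requiring that the sum of the per-channel read counts (94976 + 97024 = 192000) falls between the two aggregate num_read_ops values (185092 and 203268). A minimal sketch of that check, assuming rpc.py points at the running target and a bdev named Malloc_STAT exists (the bdev name, RPC names, and jq paths are taken from the log; the surrounding script logic is paraphrased, not quoted):
    # first aggregate snapshot
    io_count1=$(rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    # per-channel snapshot (-c splits the counters by I/O channel / thread)
    per_channel=$(rpc.py bdev_get_iostat -b Malloc_STAT -c)
    ch1=$(echo "$per_channel" | jq -r '.channels[0].num_read_ops')
    ch2=$(echo "$per_channel" | jq -r '.channels[1].num_read_ops')
    io_count_all=$((ch1 + ch2))
    # second aggregate snapshot, taken after the per-channel one
    io_count2=$(rpc.py bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    # the per-channel total must not be below the first snapshot nor above the second
    [ "$io_count_all" -ge "$io_count1" ] && [ "$io_count_all" -le "$io_count2" ]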
00:11:36.339 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:11:36.339 22:18:46 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:11:36.339 22:18:46 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:11:36.339 22:18:46 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:11:36.339 22:18:46 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:11:36.339 22:18:46 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:11:36.339 22:18:46 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:11:36.339 22:18:46 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:11:36.339 22:18:46 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:11:36.339 22:18:46 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:11:36.339 22:18:46 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:11:36.339 22:18:46 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:11:36.339 22:18:46 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:11:36.339 22:18:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:36.339 22:18:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:36.339 22:18:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:36.339 ************************************ 00:11:36.339 START TEST raid_function_test_raid0 00:11:36.339 ************************************ 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=3419233 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 3419233' 00:11:36.339 Process raid pid: 3419233 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 3419233 /var/tmp/spdk-raid.sock 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 3419233 ']' 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:36.339 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
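For orientation, the raid_function_test harness that is starting here drives everything through a dedicated RPC socket: it launches bdev_svc, creates the two base bdevs plus the raid bdev by piping a generated rpcs.txt batch file into rpc.py (the file's contents are not echoed in this log), exports the raid over NBD, and then verifies data at the block level against /dev/nbd0. A rough sketch of that setup path, with paths shortened and the rpcs.txt step left abstract:
    # start the bdev service on its own RPC socket, as the test does below
    ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    # create Base_1, Base_2 and the raid bdev from the pre-generated batch file (contents not shown here)
    cat rpcs.txt | ./scripts/rpc.py -s /var/tmp/spdk-raid.sock
    # export the raid bdev as a block device and read back its logical sector size (512 in this run)
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0
    lsblk -o LOG-SEC /dev/nbd0 | grep -v LOG-SEC | cut -d ' ' -f 5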
00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:36.339 22:18:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:36.339 [2024-07-12 22:18:46.639377] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:11:36.339 [2024-07-12 22:18:46.639446] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:36.599 [2024-07-12 22:18:46.770168] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:36.599 [2024-07-12 22:18:46.881614] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:36.858 [2024-07-12 22:18:46.952599] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:36.858 [2024-07-12 22:18:46.952636] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:37.425 22:18:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:37.425 22:18:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:11:37.425 22:18:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:11:37.425 22:18:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:11:37.425 22:18:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:37.425 22:18:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:11:37.425 22:18:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:37.684 [2024-07-12 22:18:47.820683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:37.684 [2024-07-12 22:18:47.822160] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:37.684 [2024-07-12 22:18:47.822218] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19dfbd0 00:11:37.684 [2024-07-12 22:18:47.822228] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:37.685 [2024-07-12 22:18:47.822413] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19dfb10 00:11:37.685 [2024-07-12 22:18:47.822530] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19dfbd0 00:11:37.685 [2024-07-12 22:18:47.822540] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x19dfbd0 00:11:37.685 [2024-07-12 22:18:47.822640] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:37.685 Base_1 00:11:37.685 Base_2 00:11:37.685 22:18:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:37.685 22:18:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:37.685 22:18:47 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:37.944 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:37.944 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:37.944 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:37.944 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:37.944 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:37.944 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:37.944 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:37.944 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:11:37.944 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:11:37.944 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:37.944 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:37.944 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:38.202 [2024-07-12 22:18:48.318025] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b938e0 00:11:38.202 /dev/nbd0 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:38.202 1+0 records in 00:11:38.202 1+0 records out 00:11:38.202 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256714 s, 16.0 MB/s 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:38.202 22:18:48 
bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:38.202 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:38.461 { 00:11:38.461 "nbd_device": "/dev/nbd0", 00:11:38.461 "bdev_name": "raid" 00:11:38.461 } 00:11:38.461 ]' 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:38.461 { 00:11:38.461 "nbd_device": "/dev/nbd0", 00:11:38.461 "bdev_name": "raid" 00:11:38.461 } 00:11:38.461 ]' 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:38.461 22:18:48 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:38.461 4096+0 records in 00:11:38.461 4096+0 records out 00:11:38.461 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0307541 s, 68.2 MB/s 00:11:38.461 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:38.720 4096+0 records in 00:11:38.720 4096+0 records out 00:11:38.720 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.205497 s, 10.2 MB/s 00:11:38.720 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:38.720 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:38.721 128+0 records in 00:11:38.721 128+0 records out 00:11:38.721 65536 bytes (66 kB, 64 KiB) copied, 0.000830087 s, 79.0 MB/s 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:38.721 2035+0 records in 00:11:38.721 2035+0 records out 00:11:38.721 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0114822 s, 90.7 MB/s 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:38.721 22:18:48 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:38.721 456+0 records in 00:11:38.721 456+0 records out 00:11:38.721 233472 bytes (233 kB, 228 KiB) copied, 0.00272505 s, 85.7 MB/s 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:38.721 22:18:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:38.980 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:38.980 [2024-07-12 22:18:49.246617] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:38.980 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:38.980 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:38.980 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:38.980 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:38.980 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:38.980 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:11:38.980 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:11:38.980 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:38.980 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:38.980 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:39.239 22:18:49 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 3419233 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 3419233 ']' 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 3419233 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:11:39.239 22:18:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:39.497 22:18:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3419233 00:11:39.497 22:18:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:39.497 22:18:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:39.497 22:18:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3419233' 00:11:39.497 killing process with pid 3419233 00:11:39.497 22:18:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 3419233 00:11:39.497 [2024-07-12 22:18:49.606621] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:39.497 [2024-07-12 22:18:49.606685] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:39.497 [2024-07-12 22:18:49.606729] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:39.497 [2024-07-12 22:18:49.606743] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19dfbd0 name raid, state offline 00:11:39.497 22:18:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 3419233 00:11:39.497 [2024-07-12 22:18:49.623321] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:39.755 22:18:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:11:39.755 00:11:39.755 real 0m3.247s 00:11:39.755 user 0m4.315s 00:11:39.755 sys 0m1.222s 00:11:39.755 22:18:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:39.755 22:18:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:39.755 ************************************ 00:11:39.755 END TEST raid_function_test_raid0 00:11:39.755 
************************************ 00:11:39.755 22:18:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:39.755 22:18:49 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:11:39.755 22:18:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:39.755 22:18:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:39.755 22:18:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:39.755 ************************************ 00:11:39.755 START TEST raid_function_test_concat 00:11:39.755 ************************************ 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=3419756 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 3419756' 00:11:39.755 Process raid pid: 3419756 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 3419756 /var/tmp/spdk-raid.sock 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 3419756 ']' 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:39.755 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:39.755 22:18:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:39.755 [2024-07-12 22:18:49.960580] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:11:39.755 [2024-07-12 22:18:49.960645] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:40.014 [2024-07-12 22:18:50.093341] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:40.014 [2024-07-12 22:18:50.199320] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:40.014 [2024-07-12 22:18:50.271540] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:40.014 [2024-07-12 22:18:50.271576] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:40.580 22:18:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:40.580 22:18:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:11:40.580 22:18:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:11:40.580 22:18:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:11:40.580 22:18:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:40.580 22:18:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:11:40.580 22:18:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:40.838 [2024-07-12 22:18:51.154536] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:40.838 [2024-07-12 22:18:51.155983] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:40.838 [2024-07-12 22:18:51.156040] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c42bd0 00:11:40.838 [2024-07-12 22:18:51.156051] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:40.838 [2024-07-12 22:18:51.156234] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c42b10 00:11:40.838 [2024-07-12 22:18:51.156355] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c42bd0 00:11:40.838 [2024-07-12 22:18:51.156365] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x1c42bd0 00:11:40.838 [2024-07-12 22:18:51.156466] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:40.838 Base_1 00:11:40.838 Base_2 00:11:41.096 22:18:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:41.096 22:18:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:41.096 22:18:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:41.355 22:18:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:41.355 22:18:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:41.355 22:18:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:41.355 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:11:41.355 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:41.355 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:41.355 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:41.355 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:11:41.355 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:11:41.355 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:41.355 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:41.355 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:41.355 [2024-07-12 22:18:51.663902] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1df68e0 00:11:41.355 /dev/nbd0 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:41.613 1+0 records in 00:11:41.613 1+0 records out 00:11:41.613 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257839 s, 15.9 MB/s 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:41.613 
22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:41.613 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:41.871 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:41.871 { 00:11:41.871 "nbd_device": "/dev/nbd0", 00:11:41.871 "bdev_name": "raid" 00:11:41.871 } 00:11:41.871 ]' 00:11:41.871 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:41.871 { 00:11:41.871 "nbd_device": "/dev/nbd0", 00:11:41.871 "bdev_name": "raid" 00:11:41.871 } 00:11:41.871 ]' 00:11:41.871 22:18:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd 
if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:41.871 4096+0 records in 00:11:41.871 4096+0 records out 00:11:41.871 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0292061 s, 71.8 MB/s 00:11:41.871 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:42.129 4096+0 records in 00:11:42.129 4096+0 records out 00:11:42.129 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.206011 s, 10.2 MB/s 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:42.129 128+0 records in 00:11:42.129 128+0 records out 00:11:42.129 65536 bytes (66 kB, 64 KiB) copied, 0.000833173 s, 78.7 MB/s 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:42.129 2035+0 records in 00:11:42.129 2035+0 records out 00:11:42.129 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0114464 s, 91.0 MB/s 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:42.129 456+0 records in 00:11:42.129 456+0 
records out 00:11:42.129 233472 bytes (233 kB, 228 KiB) copied, 0.00271266 s, 86.1 MB/s 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:42.129 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:42.386 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:42.386 [2024-07-12 22:18:52.587661] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:42.386 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:42.386 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:42.386 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:42.386 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:42.386 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:42.386 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:11:42.386 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:11:42.386 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:42.386 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:42.386 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # echo '' 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 3419756 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 3419756 ']' 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 3419756 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3419756 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3419756' 00:11:42.644 killing process with pid 3419756 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 3419756 00:11:42.644 22:18:52 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 3419756 00:11:42.644 [2024-07-12 22:18:52.951127] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:42.644 [2024-07-12 22:18:52.951198] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:42.644 [2024-07-12 22:18:52.951241] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:42.644 [2024-07-12 22:18:52.951253] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c42bd0 name raid, state offline 00:11:42.644 [2024-07-12 22:18:52.968544] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:42.901 22:18:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:11:42.902 00:11:42.902 real 0m3.289s 00:11:42.902 user 0m4.419s 00:11:42.902 sys 0m1.171s 00:11:42.902 22:18:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:42.902 22:18:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:42.902 ************************************ 00:11:42.902 END TEST raid_function_test_concat 00:11:42.902 ************************************ 00:11:43.160 22:18:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:43.160 22:18:53 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:11:43.160 22:18:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:43.160 22:18:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:43.160 22:18:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
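Both raid_function_test runs above (raid0 and concat) follow the same unmap/data-verify pattern: fill a 2 MiB reference file with random data, write it through the NBD export, flush and compare; then, for each of three offset/length pairs, zero that region in the reference file, discard the matching byte range on the device, flush, and compare again. The commands below condense one iteration, with the byte offsets computed from the block numbers that appear in the log (block size 512); this is a sketch of the pattern, not a verbatim excerpt of the script:
    # reference data and initial full-device verification
    dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096
    dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct
    blockdev --flushbufs /dev/nbd0
    cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
    # unmap one region in both the reference file and the raid bdev, then re-compare
    dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc
    blkdiscard -o $((1028 * 512)) -l $((2035 * 512)) /dev/nbd0   # -o 526336 -l 1041920, as in the log
    blockdev --flushbufs /dev/nbd0
    cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0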
00:11:43.160 ************************************ 00:11:43.160 START TEST raid0_resize_test 00:11:43.160 ************************************ 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=3420270 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 3420270' 00:11:43.160 Process raid pid: 3420270 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 3420270 /var/tmp/spdk-raid.sock 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 3420270 ']' 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:43.160 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:43.160 22:18:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:43.160 [2024-07-12 22:18:53.338022] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:11:43.160 [2024-07-12 22:18:53.338090] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:43.160 [2024-07-12 22:18:53.468080] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:43.419 [2024-07-12 22:18:53.576902] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:43.419 [2024-07-12 22:18:53.645205] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:43.419 [2024-07-12 22:18:53.645242] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:43.985 22:18:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:43.985 22:18:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:11:43.985 22:18:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:11:44.244 Base_1 00:11:44.244 22:18:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:11:44.502 Base_2 00:11:44.502 22:18:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:11:44.761 [2024-07-12 22:18:54.965637] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:44.761 [2024-07-12 22:18:54.967060] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:44.761 [2024-07-12 22:18:54.967109] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2168780 00:11:44.761 [2024-07-12 22:18:54.967119] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:44.761 [2024-07-12 22:18:54.967322] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cb4020 00:11:44.761 [2024-07-12 22:18:54.967414] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2168780 00:11:44.761 [2024-07-12 22:18:54.967424] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x2168780 00:11:44.761 [2024-07-12 22:18:54.967534] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:44.761 22:18:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:11:45.019 [2024-07-12 22:18:55.194229] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:45.019 [2024-07-12 22:18:55.194252] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:11:45.019 true 00:11:45.019 22:18:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:45.019 22:18:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:11:45.278 [2024-07-12 22:18:55.439034] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:45.278 22:18:55 bdev_raid.raid0_resize_test -- 
bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:11:45.278 22:18:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:11:45.278 22:18:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:11:45.278 22:18:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:11:45.537 [2024-07-12 22:18:55.683503] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:45.537 [2024-07-12 22:18:55.683527] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:11:45.537 [2024-07-12 22:18:55.683553] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:11:45.537 true 00:11:45.537 22:18:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:45.537 22:18:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:11:45.795 [2024-07-12 22:18:55.928311] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 3420270 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 3420270 ']' 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 3420270 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3420270 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3420270' 00:11:45.795 killing process with pid 3420270 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 3420270 00:11:45.795 [2024-07-12 22:18:55.996014] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:45.795 [2024-07-12 22:18:55.996069] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:45.795 [2024-07-12 22:18:55.996112] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:45.795 [2024-07-12 22:18:55.996124] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2168780 name Raid, state offline 00:11:45.795 22:18:55 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 3420270 00:11:45.795 [2024-07-12 22:18:55.997494] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:46.054 22:18:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 
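For readers following the trace above: the whole raid0 resize check reduces to a short RPC sequence against the bdev_svc app started at the top of this test (listening on /var/tmp/spdk-raid.sock). The sketch below is condensed from the traced commands only — nothing beyond what appears in the trace — with the Jenkins workspace prefix on rpc.py shortened for readability; the block counts are the ones reported in this run.

  # two 32 MiB null bdevs (512-byte blocks) assembled into a raid0 volume with a 64 KiB strip
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid
  # resizing only Base_1 to 64 MiB leaves the raid volume at 131072 blocks (64 MiB)
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid | jq '.[].num_blocks'
  # resizing Base_2 as well doubles the raid volume to 262144 blocks (128 MiB)
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid | jq '.[].num_blocks'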
00:11:46.054 00:11:46.054 real 0m2.931s 00:11:46.054 user 0m4.520s 00:11:46.054 sys 0m0.627s 00:11:46.054 22:18:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:46.054 22:18:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:46.054 ************************************ 00:11:46.054 END TEST raid0_resize_test 00:11:46.054 ************************************ 00:11:46.054 22:18:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:46.054 22:18:56 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:11:46.054 22:18:56 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:46.054 22:18:56 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:11:46.054 22:18:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:46.054 22:18:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:46.054 22:18:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:46.054 ************************************ 00:11:46.054 START TEST raid_state_function_test 00:11:46.054 ************************************ 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:46.054 22:18:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3420748 00:11:46.054 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3420748' 00:11:46.054 Process raid pid: 3420748 00:11:46.055 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:46.055 22:18:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3420748 /var/tmp/spdk-raid.sock 00:11:46.055 22:18:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3420748 ']' 00:11:46.055 22:18:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:46.055 22:18:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:46.055 22:18:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:46.055 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:46.055 22:18:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:46.055 22:18:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:46.055 [2024-07-12 22:18:56.340275] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:11:46.055 [2024-07-12 22:18:56.340339] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:46.313 [2024-07-12 22:18:56.470440] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:46.313 [2024-07-12 22:18:56.579908] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:46.572 [2024-07-12 22:18:56.640596] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:46.572 [2024-07-12 22:18:56.640624] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:47.139 22:18:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:47.139 22:18:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:47.139 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:47.397 [2024-07-12 22:18:57.495157] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:47.397 [2024-07-12 22:18:57.495202] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:47.397 [2024-07-12 22:18:57.495213] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:47.397 [2024-07-12 22:18:57.495225] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:47.397 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:47.397 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:47.397 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:47.397 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:47.397 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:47.397 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:47.397 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:47.397 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:47.397 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:47.397 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:47.397 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.397 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:47.656 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:47.656 "name": "Existed_Raid", 00:11:47.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.656 "strip_size_kb": 64, 00:11:47.656 "state": "configuring", 00:11:47.656 "raid_level": "raid0", 00:11:47.656 "superblock": false, 00:11:47.656 
"num_base_bdevs": 2, 00:11:47.656 "num_base_bdevs_discovered": 0, 00:11:47.656 "num_base_bdevs_operational": 2, 00:11:47.656 "base_bdevs_list": [ 00:11:47.656 { 00:11:47.656 "name": "BaseBdev1", 00:11:47.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.657 "is_configured": false, 00:11:47.657 "data_offset": 0, 00:11:47.657 "data_size": 0 00:11:47.657 }, 00:11:47.657 { 00:11:47.657 "name": "BaseBdev2", 00:11:47.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.657 "is_configured": false, 00:11:47.657 "data_offset": 0, 00:11:47.657 "data_size": 0 00:11:47.657 } 00:11:47.657 ] 00:11:47.657 }' 00:11:47.657 22:18:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:47.657 22:18:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:48.222 22:18:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:48.222 [2024-07-12 22:18:58.485657] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:48.222 [2024-07-12 22:18:58.485691] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11b5a80 name Existed_Raid, state configuring 00:11:48.222 22:18:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:48.480 [2024-07-12 22:18:58.730323] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:48.480 [2024-07-12 22:18:58.730351] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:48.480 [2024-07-12 22:18:58.730361] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:48.480 [2024-07-12 22:18:58.730373] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:48.480 22:18:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:48.739 [2024-07-12 22:18:58.988935] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:48.739 BaseBdev1 00:11:48.739 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:48.739 22:18:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:48.739 22:18:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:48.739 22:18:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:48.739 22:18:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:48.739 22:18:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:48.739 22:18:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:49.003 22:18:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:49.261 [ 
00:11:49.261 { 00:11:49.261 "name": "BaseBdev1", 00:11:49.261 "aliases": [ 00:11:49.261 "ce89e14c-bd2e-47d1-90f2-7e66dbbc9a46" 00:11:49.261 ], 00:11:49.261 "product_name": "Malloc disk", 00:11:49.261 "block_size": 512, 00:11:49.261 "num_blocks": 65536, 00:11:49.261 "uuid": "ce89e14c-bd2e-47d1-90f2-7e66dbbc9a46", 00:11:49.261 "assigned_rate_limits": { 00:11:49.261 "rw_ios_per_sec": 0, 00:11:49.261 "rw_mbytes_per_sec": 0, 00:11:49.261 "r_mbytes_per_sec": 0, 00:11:49.261 "w_mbytes_per_sec": 0 00:11:49.261 }, 00:11:49.261 "claimed": true, 00:11:49.261 "claim_type": "exclusive_write", 00:11:49.261 "zoned": false, 00:11:49.261 "supported_io_types": { 00:11:49.261 "read": true, 00:11:49.261 "write": true, 00:11:49.261 "unmap": true, 00:11:49.261 "flush": true, 00:11:49.261 "reset": true, 00:11:49.261 "nvme_admin": false, 00:11:49.261 "nvme_io": false, 00:11:49.261 "nvme_io_md": false, 00:11:49.261 "write_zeroes": true, 00:11:49.261 "zcopy": true, 00:11:49.261 "get_zone_info": false, 00:11:49.261 "zone_management": false, 00:11:49.261 "zone_append": false, 00:11:49.261 "compare": false, 00:11:49.261 "compare_and_write": false, 00:11:49.261 "abort": true, 00:11:49.261 "seek_hole": false, 00:11:49.261 "seek_data": false, 00:11:49.262 "copy": true, 00:11:49.262 "nvme_iov_md": false 00:11:49.262 }, 00:11:49.262 "memory_domains": [ 00:11:49.262 { 00:11:49.262 "dma_device_id": "system", 00:11:49.262 "dma_device_type": 1 00:11:49.262 }, 00:11:49.262 { 00:11:49.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:49.262 "dma_device_type": 2 00:11:49.262 } 00:11:49.262 ], 00:11:49.262 "driver_specific": {} 00:11:49.262 } 00:11:49.262 ] 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.262 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:49.521 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:49.521 "name": "Existed_Raid", 00:11:49.521 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.521 "strip_size_kb": 64, 00:11:49.521 "state": "configuring", 00:11:49.521 "raid_level": "raid0", 
00:11:49.521 "superblock": false, 00:11:49.521 "num_base_bdevs": 2, 00:11:49.521 "num_base_bdevs_discovered": 1, 00:11:49.521 "num_base_bdevs_operational": 2, 00:11:49.521 "base_bdevs_list": [ 00:11:49.521 { 00:11:49.521 "name": "BaseBdev1", 00:11:49.521 "uuid": "ce89e14c-bd2e-47d1-90f2-7e66dbbc9a46", 00:11:49.521 "is_configured": true, 00:11:49.521 "data_offset": 0, 00:11:49.521 "data_size": 65536 00:11:49.521 }, 00:11:49.521 { 00:11:49.521 "name": "BaseBdev2", 00:11:49.521 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.521 "is_configured": false, 00:11:49.521 "data_offset": 0, 00:11:49.521 "data_size": 0 00:11:49.521 } 00:11:49.521 ] 00:11:49.521 }' 00:11:49.521 22:18:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:49.521 22:18:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.088 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:50.349 [2024-07-12 22:19:00.577154] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:50.349 [2024-07-12 22:19:00.577197] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11b5350 name Existed_Raid, state configuring 00:11:50.349 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:50.610 [2024-07-12 22:19:00.821821] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:50.610 [2024-07-12 22:19:00.823327] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:50.610 [2024-07-12 22:19:00.823360] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:11:50.610 22:19:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:50.869 22:19:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:50.869 "name": "Existed_Raid", 00:11:50.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:50.869 "strip_size_kb": 64, 00:11:50.869 "state": "configuring", 00:11:50.869 "raid_level": "raid0", 00:11:50.869 "superblock": false, 00:11:50.869 "num_base_bdevs": 2, 00:11:50.869 "num_base_bdevs_discovered": 1, 00:11:50.869 "num_base_bdevs_operational": 2, 00:11:50.869 "base_bdevs_list": [ 00:11:50.869 { 00:11:50.869 "name": "BaseBdev1", 00:11:50.869 "uuid": "ce89e14c-bd2e-47d1-90f2-7e66dbbc9a46", 00:11:50.869 "is_configured": true, 00:11:50.869 "data_offset": 0, 00:11:50.869 "data_size": 65536 00:11:50.869 }, 00:11:50.869 { 00:11:50.869 "name": "BaseBdev2", 00:11:50.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:50.869 "is_configured": false, 00:11:50.869 "data_offset": 0, 00:11:50.869 "data_size": 0 00:11:50.869 } 00:11:50.869 ] 00:11:50.869 }' 00:11:50.869 22:19:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:50.869 22:19:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:51.435 22:19:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:51.694 [2024-07-12 22:19:01.847971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:51.694 [2024-07-12 22:19:01.848011] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11b6000 00:11:51.694 [2024-07-12 22:19:01.848019] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:51.694 [2024-07-12 22:19:01.848207] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10d00c0 00:11:51.694 [2024-07-12 22:19:01.848333] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11b6000 00:11:51.694 [2024-07-12 22:19:01.848348] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11b6000 00:11:51.694 [2024-07-12 22:19:01.848522] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:51.694 BaseBdev2 00:11:51.694 22:19:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:51.694 22:19:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:51.694 22:19:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:51.694 22:19:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:51.694 22:19:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:51.694 22:19:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:51.694 22:19:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:51.952 22:19:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:52.225 [ 
00:11:52.225 { 00:11:52.225 "name": "BaseBdev2", 00:11:52.225 "aliases": [ 00:11:52.225 "bac41496-72dc-47d4-94f8-4a86e0eba275" 00:11:52.225 ], 00:11:52.225 "product_name": "Malloc disk", 00:11:52.225 "block_size": 512, 00:11:52.225 "num_blocks": 65536, 00:11:52.225 "uuid": "bac41496-72dc-47d4-94f8-4a86e0eba275", 00:11:52.225 "assigned_rate_limits": { 00:11:52.225 "rw_ios_per_sec": 0, 00:11:52.225 "rw_mbytes_per_sec": 0, 00:11:52.225 "r_mbytes_per_sec": 0, 00:11:52.225 "w_mbytes_per_sec": 0 00:11:52.225 }, 00:11:52.225 "claimed": true, 00:11:52.225 "claim_type": "exclusive_write", 00:11:52.225 "zoned": false, 00:11:52.225 "supported_io_types": { 00:11:52.225 "read": true, 00:11:52.225 "write": true, 00:11:52.225 "unmap": true, 00:11:52.225 "flush": true, 00:11:52.225 "reset": true, 00:11:52.225 "nvme_admin": false, 00:11:52.225 "nvme_io": false, 00:11:52.225 "nvme_io_md": false, 00:11:52.225 "write_zeroes": true, 00:11:52.225 "zcopy": true, 00:11:52.225 "get_zone_info": false, 00:11:52.225 "zone_management": false, 00:11:52.225 "zone_append": false, 00:11:52.225 "compare": false, 00:11:52.225 "compare_and_write": false, 00:11:52.225 "abort": true, 00:11:52.225 "seek_hole": false, 00:11:52.225 "seek_data": false, 00:11:52.225 "copy": true, 00:11:52.225 "nvme_iov_md": false 00:11:52.225 }, 00:11:52.225 "memory_domains": [ 00:11:52.225 { 00:11:52.225 "dma_device_id": "system", 00:11:52.225 "dma_device_type": 1 00:11:52.225 }, 00:11:52.225 { 00:11:52.225 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:52.225 "dma_device_type": 2 00:11:52.225 } 00:11:52.225 ], 00:11:52.226 "driver_specific": {} 00:11:52.226 } 00:11:52.226 ] 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.226 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:52.498 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:11:52.498 "name": "Existed_Raid", 00:11:52.498 "uuid": "8ff112b2-4772-4408-8657-0f489f5a10a3", 00:11:52.498 "strip_size_kb": 64, 00:11:52.498 "state": "online", 00:11:52.498 "raid_level": "raid0", 00:11:52.498 "superblock": false, 00:11:52.498 "num_base_bdevs": 2, 00:11:52.498 "num_base_bdevs_discovered": 2, 00:11:52.498 "num_base_bdevs_operational": 2, 00:11:52.499 "base_bdevs_list": [ 00:11:52.499 { 00:11:52.499 "name": "BaseBdev1", 00:11:52.499 "uuid": "ce89e14c-bd2e-47d1-90f2-7e66dbbc9a46", 00:11:52.499 "is_configured": true, 00:11:52.499 "data_offset": 0, 00:11:52.499 "data_size": 65536 00:11:52.499 }, 00:11:52.499 { 00:11:52.499 "name": "BaseBdev2", 00:11:52.499 "uuid": "bac41496-72dc-47d4-94f8-4a86e0eba275", 00:11:52.499 "is_configured": true, 00:11:52.499 "data_offset": 0, 00:11:52.499 "data_size": 65536 00:11:52.499 } 00:11:52.499 ] 00:11:52.499 }' 00:11:52.499 22:19:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:52.499 22:19:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:53.064 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:53.064 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:53.064 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:53.064 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:53.064 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:53.064 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:53.064 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:53.064 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:53.323 [2024-07-12 22:19:03.420415] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:53.323 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:53.323 "name": "Existed_Raid", 00:11:53.323 "aliases": [ 00:11:53.323 "8ff112b2-4772-4408-8657-0f489f5a10a3" 00:11:53.323 ], 00:11:53.323 "product_name": "Raid Volume", 00:11:53.323 "block_size": 512, 00:11:53.323 "num_blocks": 131072, 00:11:53.323 "uuid": "8ff112b2-4772-4408-8657-0f489f5a10a3", 00:11:53.323 "assigned_rate_limits": { 00:11:53.323 "rw_ios_per_sec": 0, 00:11:53.323 "rw_mbytes_per_sec": 0, 00:11:53.323 "r_mbytes_per_sec": 0, 00:11:53.323 "w_mbytes_per_sec": 0 00:11:53.323 }, 00:11:53.323 "claimed": false, 00:11:53.323 "zoned": false, 00:11:53.323 "supported_io_types": { 00:11:53.323 "read": true, 00:11:53.323 "write": true, 00:11:53.323 "unmap": true, 00:11:53.323 "flush": true, 00:11:53.323 "reset": true, 00:11:53.323 "nvme_admin": false, 00:11:53.323 "nvme_io": false, 00:11:53.323 "nvme_io_md": false, 00:11:53.323 "write_zeroes": true, 00:11:53.323 "zcopy": false, 00:11:53.323 "get_zone_info": false, 00:11:53.323 "zone_management": false, 00:11:53.323 "zone_append": false, 00:11:53.323 "compare": false, 00:11:53.323 "compare_and_write": false, 00:11:53.323 "abort": false, 00:11:53.323 "seek_hole": false, 00:11:53.323 "seek_data": false, 00:11:53.323 "copy": false, 00:11:53.323 "nvme_iov_md": false 00:11:53.323 }, 00:11:53.323 
"memory_domains": [ 00:11:53.323 { 00:11:53.323 "dma_device_id": "system", 00:11:53.323 "dma_device_type": 1 00:11:53.323 }, 00:11:53.323 { 00:11:53.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:53.323 "dma_device_type": 2 00:11:53.323 }, 00:11:53.323 { 00:11:53.323 "dma_device_id": "system", 00:11:53.323 "dma_device_type": 1 00:11:53.323 }, 00:11:53.323 { 00:11:53.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:53.323 "dma_device_type": 2 00:11:53.323 } 00:11:53.323 ], 00:11:53.323 "driver_specific": { 00:11:53.323 "raid": { 00:11:53.323 "uuid": "8ff112b2-4772-4408-8657-0f489f5a10a3", 00:11:53.323 "strip_size_kb": 64, 00:11:53.323 "state": "online", 00:11:53.323 "raid_level": "raid0", 00:11:53.323 "superblock": false, 00:11:53.323 "num_base_bdevs": 2, 00:11:53.323 "num_base_bdevs_discovered": 2, 00:11:53.323 "num_base_bdevs_operational": 2, 00:11:53.323 "base_bdevs_list": [ 00:11:53.323 { 00:11:53.323 "name": "BaseBdev1", 00:11:53.323 "uuid": "ce89e14c-bd2e-47d1-90f2-7e66dbbc9a46", 00:11:53.323 "is_configured": true, 00:11:53.323 "data_offset": 0, 00:11:53.323 "data_size": 65536 00:11:53.323 }, 00:11:53.323 { 00:11:53.323 "name": "BaseBdev2", 00:11:53.323 "uuid": "bac41496-72dc-47d4-94f8-4a86e0eba275", 00:11:53.323 "is_configured": true, 00:11:53.323 "data_offset": 0, 00:11:53.323 "data_size": 65536 00:11:53.323 } 00:11:53.323 ] 00:11:53.323 } 00:11:53.323 } 00:11:53.323 }' 00:11:53.323 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:53.323 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:53.323 BaseBdev2' 00:11:53.323 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:53.323 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:53.323 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:53.581 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:53.581 "name": "BaseBdev1", 00:11:53.581 "aliases": [ 00:11:53.581 "ce89e14c-bd2e-47d1-90f2-7e66dbbc9a46" 00:11:53.581 ], 00:11:53.581 "product_name": "Malloc disk", 00:11:53.581 "block_size": 512, 00:11:53.581 "num_blocks": 65536, 00:11:53.581 "uuid": "ce89e14c-bd2e-47d1-90f2-7e66dbbc9a46", 00:11:53.581 "assigned_rate_limits": { 00:11:53.581 "rw_ios_per_sec": 0, 00:11:53.581 "rw_mbytes_per_sec": 0, 00:11:53.581 "r_mbytes_per_sec": 0, 00:11:53.581 "w_mbytes_per_sec": 0 00:11:53.581 }, 00:11:53.581 "claimed": true, 00:11:53.581 "claim_type": "exclusive_write", 00:11:53.581 "zoned": false, 00:11:53.581 "supported_io_types": { 00:11:53.581 "read": true, 00:11:53.581 "write": true, 00:11:53.581 "unmap": true, 00:11:53.581 "flush": true, 00:11:53.581 "reset": true, 00:11:53.581 "nvme_admin": false, 00:11:53.581 "nvme_io": false, 00:11:53.581 "nvme_io_md": false, 00:11:53.581 "write_zeroes": true, 00:11:53.581 "zcopy": true, 00:11:53.581 "get_zone_info": false, 00:11:53.581 "zone_management": false, 00:11:53.581 "zone_append": false, 00:11:53.581 "compare": false, 00:11:53.581 "compare_and_write": false, 00:11:53.581 "abort": true, 00:11:53.581 "seek_hole": false, 00:11:53.581 "seek_data": false, 00:11:53.581 "copy": true, 00:11:53.581 "nvme_iov_md": false 00:11:53.581 }, 00:11:53.581 
"memory_domains": [ 00:11:53.581 { 00:11:53.581 "dma_device_id": "system", 00:11:53.581 "dma_device_type": 1 00:11:53.581 }, 00:11:53.581 { 00:11:53.581 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:53.581 "dma_device_type": 2 00:11:53.581 } 00:11:53.581 ], 00:11:53.581 "driver_specific": {} 00:11:53.581 }' 00:11:53.581 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:53.581 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:53.581 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:53.581 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:53.581 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:53.581 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:53.581 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:53.840 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:53.840 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:53.840 22:19:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:53.840 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:53.840 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:53.840 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:53.840 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:53.840 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:54.098 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:54.098 "name": "BaseBdev2", 00:11:54.098 "aliases": [ 00:11:54.098 "bac41496-72dc-47d4-94f8-4a86e0eba275" 00:11:54.098 ], 00:11:54.098 "product_name": "Malloc disk", 00:11:54.098 "block_size": 512, 00:11:54.098 "num_blocks": 65536, 00:11:54.098 "uuid": "bac41496-72dc-47d4-94f8-4a86e0eba275", 00:11:54.098 "assigned_rate_limits": { 00:11:54.098 "rw_ios_per_sec": 0, 00:11:54.098 "rw_mbytes_per_sec": 0, 00:11:54.098 "r_mbytes_per_sec": 0, 00:11:54.098 "w_mbytes_per_sec": 0 00:11:54.098 }, 00:11:54.098 "claimed": true, 00:11:54.098 "claim_type": "exclusive_write", 00:11:54.098 "zoned": false, 00:11:54.098 "supported_io_types": { 00:11:54.098 "read": true, 00:11:54.098 "write": true, 00:11:54.098 "unmap": true, 00:11:54.098 "flush": true, 00:11:54.098 "reset": true, 00:11:54.098 "nvme_admin": false, 00:11:54.098 "nvme_io": false, 00:11:54.098 "nvme_io_md": false, 00:11:54.098 "write_zeroes": true, 00:11:54.098 "zcopy": true, 00:11:54.098 "get_zone_info": false, 00:11:54.098 "zone_management": false, 00:11:54.098 "zone_append": false, 00:11:54.098 "compare": false, 00:11:54.098 "compare_and_write": false, 00:11:54.098 "abort": true, 00:11:54.098 "seek_hole": false, 00:11:54.098 "seek_data": false, 00:11:54.098 "copy": true, 00:11:54.098 "nvme_iov_md": false 00:11:54.098 }, 00:11:54.098 "memory_domains": [ 00:11:54.098 { 00:11:54.098 "dma_device_id": "system", 00:11:54.098 "dma_device_type": 1 00:11:54.098 }, 00:11:54.098 { 00:11:54.098 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:11:54.098 "dma_device_type": 2 00:11:54.098 } 00:11:54.098 ], 00:11:54.098 "driver_specific": {} 00:11:54.098 }' 00:11:54.098 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:54.098 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:54.098 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:54.098 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:54.356 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:54.356 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:54.356 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:54.356 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:54.356 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:54.356 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:54.356 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:54.356 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:54.356 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:54.615 [2024-07-12 22:19:04.892105] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:54.615 [2024-07-12 22:19:04.892130] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:54.615 [2024-07-12 22:19:04.892172] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:54.615 22:19:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.615 22:19:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:54.873 22:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:54.873 "name": "Existed_Raid", 00:11:54.873 "uuid": "8ff112b2-4772-4408-8657-0f489f5a10a3", 00:11:54.873 "strip_size_kb": 64, 00:11:54.873 "state": "offline", 00:11:54.873 "raid_level": "raid0", 00:11:54.873 "superblock": false, 00:11:54.873 "num_base_bdevs": 2, 00:11:54.873 "num_base_bdevs_discovered": 1, 00:11:54.873 "num_base_bdevs_operational": 1, 00:11:54.873 "base_bdevs_list": [ 00:11:54.873 { 00:11:54.873 "name": null, 00:11:54.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:54.873 "is_configured": false, 00:11:54.873 "data_offset": 0, 00:11:54.873 "data_size": 65536 00:11:54.873 }, 00:11:54.873 { 00:11:54.873 "name": "BaseBdev2", 00:11:54.873 "uuid": "bac41496-72dc-47d4-94f8-4a86e0eba275", 00:11:54.873 "is_configured": true, 00:11:54.873 "data_offset": 0, 00:11:54.873 "data_size": 65536 00:11:54.873 } 00:11:54.873 ] 00:11:54.873 }' 00:11:54.873 22:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:54.874 22:19:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:55.440 22:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:55.440 22:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:55.440 22:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.440 22:19:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:55.698 22:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:55.698 22:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:55.698 22:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:55.956 [2024-07-12 22:19:06.228651] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:55.956 [2024-07-12 22:19:06.228706] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11b6000 name Existed_Raid, state offline 00:11:55.956 22:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:55.956 22:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:55.956 22:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.956 22:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:56.214 22:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:56.214 22:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:56.214 22:19:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:56.214 22:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3420748 00:11:56.214 22:19:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3420748 ']' 00:11:56.214 22:19:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3420748 00:11:56.214 22:19:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:56.214 22:19:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:56.214 22:19:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3420748 00:11:56.473 22:19:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:56.473 22:19:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:56.473 22:19:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3420748' 00:11:56.473 killing process with pid 3420748 00:11:56.473 22:19:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3420748 00:11:56.473 [2024-07-12 22:19:06.557394] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:56.473 22:19:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3420748 00:11:56.473 [2024-07-12 22:19:06.558298] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:56.473 22:19:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:56.473 00:11:56.473 real 0m10.492s 00:11:56.473 user 0m18.687s 00:11:56.473 sys 0m1.900s 00:11:56.473 22:19:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:56.473 22:19:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:56.473 ************************************ 00:11:56.473 END TEST raid_state_function_test 00:11:56.473 ************************************ 00:11:56.732 22:19:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:56.732 22:19:06 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:11:56.732 22:19:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:56.732 22:19:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:56.732 22:19:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:56.732 ************************************ 00:11:56.732 START TEST raid_state_function_test_sb 00:11:56.732 ************************************ 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3422316 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3422316' 00:11:56.732 Process raid pid: 3422316 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3422316 /var/tmp/spdk-raid.sock 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3422316 ']' 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:56.732 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:56.732 22:19:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:56.732 [2024-07-12 22:19:06.917016] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:11:56.732 [2024-07-12 22:19:06.917085] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:56.732 [2024-07-12 22:19:07.047975] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:56.990 [2024-07-12 22:19:07.145761] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:56.990 [2024-07-12 22:19:07.210118] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:56.990 [2024-07-12 22:19:07.210149] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:57.556 22:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:57.556 22:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:57.556 22:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:57.814 [2024-07-12 22:19:08.072459] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:57.814 [2024-07-12 22:19:08.072502] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:57.814 [2024-07-12 22:19:08.072517] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:57.814 [2024-07-12 22:19:08.072529] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:57.814 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:57.814 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:57.814 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:57.814 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:57.814 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:57.814 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:57.814 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:57.814 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:57.814 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:57.814 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:57.814 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.814 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:58.073 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:58.073 "name": "Existed_Raid", 00:11:58.073 "uuid": "b5ba0373-e3fc-46b0-b7a2-92aae95b413b", 00:11:58.073 "strip_size_kb": 64, 00:11:58.073 "state": "configuring", 00:11:58.073 "raid_level": 
"raid0", 00:11:58.073 "superblock": true, 00:11:58.073 "num_base_bdevs": 2, 00:11:58.073 "num_base_bdevs_discovered": 0, 00:11:58.073 "num_base_bdevs_operational": 2, 00:11:58.073 "base_bdevs_list": [ 00:11:58.073 { 00:11:58.073 "name": "BaseBdev1", 00:11:58.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:58.073 "is_configured": false, 00:11:58.073 "data_offset": 0, 00:11:58.073 "data_size": 0 00:11:58.073 }, 00:11:58.073 { 00:11:58.073 "name": "BaseBdev2", 00:11:58.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:58.073 "is_configured": false, 00:11:58.073 "data_offset": 0, 00:11:58.073 "data_size": 0 00:11:58.073 } 00:11:58.073 ] 00:11:58.073 }' 00:11:58.073 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:58.073 22:19:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:58.654 22:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:58.912 [2024-07-12 22:19:09.167207] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:58.912 [2024-07-12 22:19:09.167240] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x135ea80 name Existed_Raid, state configuring 00:11:58.912 22:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:59.170 [2024-07-12 22:19:09.411878] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:59.170 [2024-07-12 22:19:09.411909] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:59.170 [2024-07-12 22:19:09.411919] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:59.170 [2024-07-12 22:19:09.411939] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:59.170 22:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:59.428 [2024-07-12 22:19:09.666493] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:59.428 BaseBdev1 00:11:59.428 22:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:59.428 22:19:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:59.428 22:19:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:59.428 22:19:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:59.428 22:19:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:59.428 22:19:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:59.428 22:19:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:59.686 22:19:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:59.945 [ 00:11:59.945 { 00:11:59.945 "name": "BaseBdev1", 00:11:59.945 "aliases": [ 00:11:59.945 "0706606c-8f12-428e-88f4-5cb11cc62c3a" 00:11:59.945 ], 00:11:59.945 "product_name": "Malloc disk", 00:11:59.945 "block_size": 512, 00:11:59.945 "num_blocks": 65536, 00:11:59.945 "uuid": "0706606c-8f12-428e-88f4-5cb11cc62c3a", 00:11:59.945 "assigned_rate_limits": { 00:11:59.945 "rw_ios_per_sec": 0, 00:11:59.945 "rw_mbytes_per_sec": 0, 00:11:59.945 "r_mbytes_per_sec": 0, 00:11:59.945 "w_mbytes_per_sec": 0 00:11:59.945 }, 00:11:59.945 "claimed": true, 00:11:59.945 "claim_type": "exclusive_write", 00:11:59.945 "zoned": false, 00:11:59.945 "supported_io_types": { 00:11:59.945 "read": true, 00:11:59.945 "write": true, 00:11:59.945 "unmap": true, 00:11:59.945 "flush": true, 00:11:59.945 "reset": true, 00:11:59.945 "nvme_admin": false, 00:11:59.945 "nvme_io": false, 00:11:59.945 "nvme_io_md": false, 00:11:59.945 "write_zeroes": true, 00:11:59.945 "zcopy": true, 00:11:59.945 "get_zone_info": false, 00:11:59.945 "zone_management": false, 00:11:59.945 "zone_append": false, 00:11:59.945 "compare": false, 00:11:59.945 "compare_and_write": false, 00:11:59.945 "abort": true, 00:11:59.945 "seek_hole": false, 00:11:59.945 "seek_data": false, 00:11:59.945 "copy": true, 00:11:59.945 "nvme_iov_md": false 00:11:59.945 }, 00:11:59.945 "memory_domains": [ 00:11:59.945 { 00:11:59.945 "dma_device_id": "system", 00:11:59.945 "dma_device_type": 1 00:11:59.945 }, 00:11:59.945 { 00:11:59.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.945 "dma_device_type": 2 00:11:59.945 } 00:11:59.945 ], 00:11:59.945 "driver_specific": {} 00:11:59.945 } 00:11:59.945 ] 00:11:59.945 22:19:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:59.945 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:59.945 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:59.945 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:59.945 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:59.945 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:59.945 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:59.945 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:59.945 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:59.945 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:59.945 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:59.946 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.946 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:00.215 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:00.215 "name": 
"Existed_Raid", 00:12:00.215 "uuid": "b836429e-48c8-42ef-acf4-5f16d9556a5b", 00:12:00.215 "strip_size_kb": 64, 00:12:00.215 "state": "configuring", 00:12:00.215 "raid_level": "raid0", 00:12:00.215 "superblock": true, 00:12:00.215 "num_base_bdevs": 2, 00:12:00.215 "num_base_bdevs_discovered": 1, 00:12:00.215 "num_base_bdevs_operational": 2, 00:12:00.215 "base_bdevs_list": [ 00:12:00.215 { 00:12:00.215 "name": "BaseBdev1", 00:12:00.215 "uuid": "0706606c-8f12-428e-88f4-5cb11cc62c3a", 00:12:00.215 "is_configured": true, 00:12:00.215 "data_offset": 2048, 00:12:00.215 "data_size": 63488 00:12:00.215 }, 00:12:00.215 { 00:12:00.215 "name": "BaseBdev2", 00:12:00.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:00.215 "is_configured": false, 00:12:00.215 "data_offset": 0, 00:12:00.215 "data_size": 0 00:12:00.215 } 00:12:00.215 ] 00:12:00.215 }' 00:12:00.215 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:00.215 22:19:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:00.780 22:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:01.038 [2024-07-12 22:19:11.162569] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:01.038 [2024-07-12 22:19:11.162615] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x135e350 name Existed_Raid, state configuring 00:12:01.038 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:01.297 [2024-07-12 22:19:11.407266] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:01.297 [2024-07-12 22:19:11.408759] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:01.297 [2024-07-12 22:19:11.408792] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@124 -- # local tmp 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.297 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:01.556 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:01.556 "name": "Existed_Raid", 00:12:01.556 "uuid": "2a972271-d7d7-4b52-8db2-e339c2440207", 00:12:01.556 "strip_size_kb": 64, 00:12:01.556 "state": "configuring", 00:12:01.556 "raid_level": "raid0", 00:12:01.556 "superblock": true, 00:12:01.556 "num_base_bdevs": 2, 00:12:01.556 "num_base_bdevs_discovered": 1, 00:12:01.556 "num_base_bdevs_operational": 2, 00:12:01.556 "base_bdevs_list": [ 00:12:01.556 { 00:12:01.556 "name": "BaseBdev1", 00:12:01.556 "uuid": "0706606c-8f12-428e-88f4-5cb11cc62c3a", 00:12:01.556 "is_configured": true, 00:12:01.556 "data_offset": 2048, 00:12:01.556 "data_size": 63488 00:12:01.556 }, 00:12:01.556 { 00:12:01.556 "name": "BaseBdev2", 00:12:01.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:01.556 "is_configured": false, 00:12:01.556 "data_offset": 0, 00:12:01.556 "data_size": 0 00:12:01.556 } 00:12:01.556 ] 00:12:01.556 }' 00:12:01.556 22:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:01.556 22:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:02.123 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:02.381 [2024-07-12 22:19:12.489711] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:02.381 [2024-07-12 22:19:12.489874] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x135f000 00:12:02.382 [2024-07-12 22:19:12.489888] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:02.382 [2024-07-12 22:19:12.490080] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12790c0 00:12:02.382 [2024-07-12 22:19:12.490196] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x135f000 00:12:02.382 [2024-07-12 22:19:12.490206] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x135f000 00:12:02.382 [2024-07-12 22:19:12.490302] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:02.382 BaseBdev2 00:12:02.382 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:02.382 22:19:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:02.382 22:19:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:02.382 22:19:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:02.382 22:19:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:02.382 22:19:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:02.382 22:19:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:02.650 22:19:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:02.650 [ 00:12:02.650 { 00:12:02.650 "name": "BaseBdev2", 00:12:02.650 "aliases": [ 00:12:02.650 "e5d11e3d-67f4-4bce-a2d4-411a91811d13" 00:12:02.650 ], 00:12:02.650 "product_name": "Malloc disk", 00:12:02.650 "block_size": 512, 00:12:02.650 "num_blocks": 65536, 00:12:02.650 "uuid": "e5d11e3d-67f4-4bce-a2d4-411a91811d13", 00:12:02.650 "assigned_rate_limits": { 00:12:02.650 "rw_ios_per_sec": 0, 00:12:02.650 "rw_mbytes_per_sec": 0, 00:12:02.650 "r_mbytes_per_sec": 0, 00:12:02.650 "w_mbytes_per_sec": 0 00:12:02.650 }, 00:12:02.650 "claimed": true, 00:12:02.650 "claim_type": "exclusive_write", 00:12:02.650 "zoned": false, 00:12:02.650 "supported_io_types": { 00:12:02.650 "read": true, 00:12:02.650 "write": true, 00:12:02.650 "unmap": true, 00:12:02.650 "flush": true, 00:12:02.650 "reset": true, 00:12:02.650 "nvme_admin": false, 00:12:02.650 "nvme_io": false, 00:12:02.650 "nvme_io_md": false, 00:12:02.650 "write_zeroes": true, 00:12:02.650 "zcopy": true, 00:12:02.650 "get_zone_info": false, 00:12:02.650 "zone_management": false, 00:12:02.650 "zone_append": false, 00:12:02.650 "compare": false, 00:12:02.650 "compare_and_write": false, 00:12:02.650 "abort": true, 00:12:02.650 "seek_hole": false, 00:12:02.650 "seek_data": false, 00:12:02.650 "copy": true, 00:12:02.650 "nvme_iov_md": false 00:12:02.650 }, 00:12:02.650 "memory_domains": [ 00:12:02.650 { 00:12:02.650 "dma_device_id": "system", 00:12:02.650 "dma_device_type": 1 00:12:02.650 }, 00:12:02.650 { 00:12:02.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.650 "dma_device_type": 2 00:12:02.650 } 00:12:02.650 ], 00:12:02.650 "driver_specific": {} 00:12:02.650 } 00:12:02.650 ] 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.915 22:19:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:02.915 22:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.915 "name": "Existed_Raid", 00:12:02.915 "uuid": "2a972271-d7d7-4b52-8db2-e339c2440207", 00:12:02.915 "strip_size_kb": 64, 00:12:02.915 "state": "online", 00:12:02.915 "raid_level": "raid0", 00:12:02.915 "superblock": true, 00:12:02.915 "num_base_bdevs": 2, 00:12:02.915 "num_base_bdevs_discovered": 2, 00:12:02.915 "num_base_bdevs_operational": 2, 00:12:02.915 "base_bdevs_list": [ 00:12:02.915 { 00:12:02.915 "name": "BaseBdev1", 00:12:02.915 "uuid": "0706606c-8f12-428e-88f4-5cb11cc62c3a", 00:12:02.915 "is_configured": true, 00:12:02.915 "data_offset": 2048, 00:12:02.915 "data_size": 63488 00:12:02.915 }, 00:12:02.915 { 00:12:02.915 "name": "BaseBdev2", 00:12:02.915 "uuid": "e5d11e3d-67f4-4bce-a2d4-411a91811d13", 00:12:02.915 "is_configured": true, 00:12:02.915 "data_offset": 2048, 00:12:02.915 "data_size": 63488 00:12:02.915 } 00:12:02.915 ] 00:12:02.915 }' 00:12:02.915 22:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.915 22:19:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:03.483 22:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:03.483 22:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:03.483 22:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:03.483 22:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:03.483 22:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:03.483 22:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:03.483 22:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:03.483 22:19:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:03.742 [2024-07-12 22:19:13.981976] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:03.742 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:03.742 "name": "Existed_Raid", 00:12:03.742 "aliases": [ 00:12:03.742 "2a972271-d7d7-4b52-8db2-e339c2440207" 00:12:03.742 ], 00:12:03.742 "product_name": "Raid Volume", 00:12:03.742 "block_size": 512, 00:12:03.742 "num_blocks": 126976, 00:12:03.742 "uuid": "2a972271-d7d7-4b52-8db2-e339c2440207", 00:12:03.742 "assigned_rate_limits": { 00:12:03.742 "rw_ios_per_sec": 0, 00:12:03.742 "rw_mbytes_per_sec": 0, 00:12:03.742 "r_mbytes_per_sec": 0, 00:12:03.742 "w_mbytes_per_sec": 0 00:12:03.742 }, 00:12:03.742 "claimed": false, 00:12:03.742 "zoned": false, 00:12:03.742 "supported_io_types": { 00:12:03.742 "read": true, 00:12:03.742 "write": true, 00:12:03.742 "unmap": true, 00:12:03.742 "flush": true, 00:12:03.742 "reset": true, 00:12:03.742 "nvme_admin": false, 00:12:03.742 "nvme_io": false, 00:12:03.742 "nvme_io_md": false, 00:12:03.742 "write_zeroes": true, 
00:12:03.742 "zcopy": false, 00:12:03.742 "get_zone_info": false, 00:12:03.742 "zone_management": false, 00:12:03.742 "zone_append": false, 00:12:03.742 "compare": false, 00:12:03.742 "compare_and_write": false, 00:12:03.742 "abort": false, 00:12:03.742 "seek_hole": false, 00:12:03.742 "seek_data": false, 00:12:03.742 "copy": false, 00:12:03.742 "nvme_iov_md": false 00:12:03.742 }, 00:12:03.742 "memory_domains": [ 00:12:03.742 { 00:12:03.742 "dma_device_id": "system", 00:12:03.742 "dma_device_type": 1 00:12:03.742 }, 00:12:03.742 { 00:12:03.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.742 "dma_device_type": 2 00:12:03.742 }, 00:12:03.742 { 00:12:03.742 "dma_device_id": "system", 00:12:03.742 "dma_device_type": 1 00:12:03.742 }, 00:12:03.742 { 00:12:03.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.742 "dma_device_type": 2 00:12:03.742 } 00:12:03.742 ], 00:12:03.742 "driver_specific": { 00:12:03.742 "raid": { 00:12:03.742 "uuid": "2a972271-d7d7-4b52-8db2-e339c2440207", 00:12:03.742 "strip_size_kb": 64, 00:12:03.742 "state": "online", 00:12:03.742 "raid_level": "raid0", 00:12:03.742 "superblock": true, 00:12:03.742 "num_base_bdevs": 2, 00:12:03.742 "num_base_bdevs_discovered": 2, 00:12:03.742 "num_base_bdevs_operational": 2, 00:12:03.742 "base_bdevs_list": [ 00:12:03.742 { 00:12:03.742 "name": "BaseBdev1", 00:12:03.742 "uuid": "0706606c-8f12-428e-88f4-5cb11cc62c3a", 00:12:03.742 "is_configured": true, 00:12:03.742 "data_offset": 2048, 00:12:03.742 "data_size": 63488 00:12:03.742 }, 00:12:03.742 { 00:12:03.742 "name": "BaseBdev2", 00:12:03.742 "uuid": "e5d11e3d-67f4-4bce-a2d4-411a91811d13", 00:12:03.742 "is_configured": true, 00:12:03.742 "data_offset": 2048, 00:12:03.742 "data_size": 63488 00:12:03.742 } 00:12:03.742 ] 00:12:03.742 } 00:12:03.742 } 00:12:03.742 }' 00:12:03.742 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:03.742 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:03.742 BaseBdev2' 00:12:03.742 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:03.742 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:03.742 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:04.000 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:04.000 "name": "BaseBdev1", 00:12:04.000 "aliases": [ 00:12:04.000 "0706606c-8f12-428e-88f4-5cb11cc62c3a" 00:12:04.000 ], 00:12:04.000 "product_name": "Malloc disk", 00:12:04.000 "block_size": 512, 00:12:04.000 "num_blocks": 65536, 00:12:04.000 "uuid": "0706606c-8f12-428e-88f4-5cb11cc62c3a", 00:12:04.000 "assigned_rate_limits": { 00:12:04.000 "rw_ios_per_sec": 0, 00:12:04.000 "rw_mbytes_per_sec": 0, 00:12:04.000 "r_mbytes_per_sec": 0, 00:12:04.000 "w_mbytes_per_sec": 0 00:12:04.000 }, 00:12:04.001 "claimed": true, 00:12:04.001 "claim_type": "exclusive_write", 00:12:04.001 "zoned": false, 00:12:04.001 "supported_io_types": { 00:12:04.001 "read": true, 00:12:04.001 "write": true, 00:12:04.001 "unmap": true, 00:12:04.001 "flush": true, 00:12:04.001 "reset": true, 00:12:04.001 "nvme_admin": false, 00:12:04.001 "nvme_io": false, 00:12:04.001 "nvme_io_md": false, 00:12:04.001 
"write_zeroes": true, 00:12:04.001 "zcopy": true, 00:12:04.001 "get_zone_info": false, 00:12:04.001 "zone_management": false, 00:12:04.001 "zone_append": false, 00:12:04.001 "compare": false, 00:12:04.001 "compare_and_write": false, 00:12:04.001 "abort": true, 00:12:04.001 "seek_hole": false, 00:12:04.001 "seek_data": false, 00:12:04.001 "copy": true, 00:12:04.001 "nvme_iov_md": false 00:12:04.001 }, 00:12:04.001 "memory_domains": [ 00:12:04.001 { 00:12:04.001 "dma_device_id": "system", 00:12:04.001 "dma_device_type": 1 00:12:04.001 }, 00:12:04.001 { 00:12:04.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.001 "dma_device_type": 2 00:12:04.001 } 00:12:04.001 ], 00:12:04.001 "driver_specific": {} 00:12:04.001 }' 00:12:04.001 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.259 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.259 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:04.259 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.259 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.259 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:04.259 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.259 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.259 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.259 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.518 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.518 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.518 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:04.518 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:04.518 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:04.776 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:04.776 "name": "BaseBdev2", 00:12:04.776 "aliases": [ 00:12:04.776 "e5d11e3d-67f4-4bce-a2d4-411a91811d13" 00:12:04.776 ], 00:12:04.776 "product_name": "Malloc disk", 00:12:04.776 "block_size": 512, 00:12:04.776 "num_blocks": 65536, 00:12:04.776 "uuid": "e5d11e3d-67f4-4bce-a2d4-411a91811d13", 00:12:04.776 "assigned_rate_limits": { 00:12:04.776 "rw_ios_per_sec": 0, 00:12:04.776 "rw_mbytes_per_sec": 0, 00:12:04.776 "r_mbytes_per_sec": 0, 00:12:04.776 "w_mbytes_per_sec": 0 00:12:04.776 }, 00:12:04.776 "claimed": true, 00:12:04.776 "claim_type": "exclusive_write", 00:12:04.776 "zoned": false, 00:12:04.776 "supported_io_types": { 00:12:04.776 "read": true, 00:12:04.776 "write": true, 00:12:04.776 "unmap": true, 00:12:04.776 "flush": true, 00:12:04.776 "reset": true, 00:12:04.776 "nvme_admin": false, 00:12:04.776 "nvme_io": false, 00:12:04.776 "nvme_io_md": false, 00:12:04.776 "write_zeroes": true, 00:12:04.776 "zcopy": true, 00:12:04.776 "get_zone_info": false, 00:12:04.776 "zone_management": false, 
00:12:04.776 "zone_append": false, 00:12:04.776 "compare": false, 00:12:04.776 "compare_and_write": false, 00:12:04.776 "abort": true, 00:12:04.776 "seek_hole": false, 00:12:04.776 "seek_data": false, 00:12:04.776 "copy": true, 00:12:04.776 "nvme_iov_md": false 00:12:04.776 }, 00:12:04.776 "memory_domains": [ 00:12:04.776 { 00:12:04.776 "dma_device_id": "system", 00:12:04.776 "dma_device_type": 1 00:12:04.776 }, 00:12:04.776 { 00:12:04.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.776 "dma_device_type": 2 00:12:04.776 } 00:12:04.776 ], 00:12:04.776 "driver_specific": {} 00:12:04.776 }' 00:12:04.776 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.776 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.776 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:04.776 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.776 22:19:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.776 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:04.776 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.776 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.776 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.776 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:05.034 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:05.034 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:05.034 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:05.292 [2024-07-12 22:19:15.409529] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:05.292 [2024-07-12 22:19:15.409554] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:05.292 [2024-07-12 22:19:15.409597] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:05.292 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.551 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:05.551 "name": "Existed_Raid", 00:12:05.551 "uuid": "2a972271-d7d7-4b52-8db2-e339c2440207", 00:12:05.551 "strip_size_kb": 64, 00:12:05.551 "state": "offline", 00:12:05.551 "raid_level": "raid0", 00:12:05.551 "superblock": true, 00:12:05.551 "num_base_bdevs": 2, 00:12:05.551 "num_base_bdevs_discovered": 1, 00:12:05.551 "num_base_bdevs_operational": 1, 00:12:05.551 "base_bdevs_list": [ 00:12:05.551 { 00:12:05.551 "name": null, 00:12:05.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.551 "is_configured": false, 00:12:05.551 "data_offset": 2048, 00:12:05.551 "data_size": 63488 00:12:05.551 }, 00:12:05.551 { 00:12:05.551 "name": "BaseBdev2", 00:12:05.551 "uuid": "e5d11e3d-67f4-4bce-a2d4-411a91811d13", 00:12:05.551 "is_configured": true, 00:12:05.551 "data_offset": 2048, 00:12:05.551 "data_size": 63488 00:12:05.551 } 00:12:05.551 ] 00:12:05.551 }' 00:12:05.551 22:19:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:05.551 22:19:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:06.118 22:19:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:06.118 22:19:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:06.118 22:19:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.118 22:19:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:06.686 22:19:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:06.686 22:19:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:06.686 22:19:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:06.687 [2024-07-12 22:19:17.002779] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:06.687 [2024-07-12 22:19:17.002833] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x135f000 name Existed_Raid, state offline 00:12:06.945 22:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:06.945 22:19:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:06.945 22:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.945 22:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3422316 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3422316 ']' 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3422316 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3422316 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3422316' 00:12:07.204 killing process with pid 3422316 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3422316 00:12:07.204 [2024-07-12 22:19:17.323494] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:07.204 22:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3422316 00:12:07.204 [2024-07-12 22:19:17.324398] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:07.463 22:19:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:07.463 00:12:07.463 real 0m10.702s 00:12:07.463 user 0m19.045s 00:12:07.463 sys 0m1.955s 00:12:07.463 22:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:07.463 22:19:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:07.463 ************************************ 00:12:07.463 END TEST raid_state_function_test_sb 00:12:07.463 ************************************ 00:12:07.463 22:19:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:07.463 22:19:17 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:12:07.463 22:19:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:07.463 22:19:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:07.463 22:19:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:07.463 ************************************ 00:12:07.463 START TEST raid_superblock_test 00:12:07.463 ************************************ 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:12:07.463 22:19:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3423997 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3423997 /var/tmp/spdk-raid.sock 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3423997 ']' 00:12:07.463 22:19:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:07.464 22:19:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:07.464 22:19:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:07.464 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:07.464 22:19:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:07.464 22:19:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.464 [2024-07-12 22:19:17.695349] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
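Where the state-function test above built its raid directly on malloc disks, raid_superblock_test, which starts here, layers a passthru bdev (pt1, pt2) over each malloc disk and assembles the raid0 volume from those. Its construction steps condense to roughly the following sketch (using the same rpc.py calls, bdev names, and fixed pt1/pt2 UUIDs that appear in this log; $RPC is again shorthand for the rpc.py invocation against /var/tmp/spdk-raid.sock):

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Two 32 MiB malloc disks, each wrapped in a passthru bdev with a fixed UUID.
$RPC bdev_malloc_create 32 512 -b malloc1
$RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$RPC bdev_malloc_create 32 512 -b malloc2
$RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

# Assemble raid0 with 64 KiB strips (-z 64) and an on-disk superblock (-s).
$RPC bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s

The superblock accounts for the geometry reported in the dumps that follow: each 65536-block base bdev shows data_offset 2048 and data_size 63488, and the assembled raid_bdev1 reports 126976 blocks.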
00:12:07.464 [2024-07-12 22:19:17.695416] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3423997 ] 00:12:07.723 [2024-07-12 22:19:17.823255] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:07.723 [2024-07-12 22:19:17.929062] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:07.723 [2024-07-12 22:19:17.998030] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:07.723 [2024-07-12 22:19:17.998067] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:08.290 22:19:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:08.290 22:19:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:08.551 22:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:08.551 22:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:08.551 22:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:08.551 22:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:08.551 22:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:08.551 22:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:08.551 22:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:08.551 22:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:08.551 22:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:08.551 malloc1 00:12:08.552 22:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:08.846 [2024-07-12 22:19:19.081965] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:08.846 [2024-07-12 22:19:19.082015] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:08.846 [2024-07-12 22:19:19.082038] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ced570 00:12:08.846 [2024-07-12 22:19:19.082050] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:08.846 [2024-07-12 22:19:19.083807] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:08.846 [2024-07-12 22:19:19.083839] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:08.846 pt1 00:12:08.846 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:08.846 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:08.846 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:08.846 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:08.846 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:08.846 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:08.846 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:08.846 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:08.846 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:09.105 malloc2 00:12:09.105 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:09.364 [2024-07-12 22:19:19.577307] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:09.364 [2024-07-12 22:19:19.577356] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:09.364 [2024-07-12 22:19:19.577375] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cee970 00:12:09.364 [2024-07-12 22:19:19.577388] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:09.364 [2024-07-12 22:19:19.579053] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:09.364 [2024-07-12 22:19:19.579082] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:09.364 pt2 00:12:09.364 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:09.364 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:09.364 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:09.623 [2024-07-12 22:19:19.821990] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:09.623 [2024-07-12 22:19:19.823373] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:09.623 [2024-07-12 22:19:19.823524] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e91270 00:12:09.623 [2024-07-12 22:19:19.823537] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:09.623 [2024-07-12 22:19:19.823737] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e86c10 00:12:09.623 [2024-07-12 22:19:19.823889] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e91270 00:12:09.623 [2024-07-12 22:19:19.823899] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e91270 00:12:09.623 [2024-07-12 22:19:19.824016] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:09.623 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:09.623 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:09.623 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:09.623 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:09.623 22:19:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:09.623 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:09.623 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.623 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.623 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:09.623 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.623 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.623 22:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:09.883 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.883 "name": "raid_bdev1", 00:12:09.883 "uuid": "c862d237-6467-485b-b351-36af06a29f2a", 00:12:09.883 "strip_size_kb": 64, 00:12:09.883 "state": "online", 00:12:09.883 "raid_level": "raid0", 00:12:09.883 "superblock": true, 00:12:09.883 "num_base_bdevs": 2, 00:12:09.883 "num_base_bdevs_discovered": 2, 00:12:09.883 "num_base_bdevs_operational": 2, 00:12:09.883 "base_bdevs_list": [ 00:12:09.883 { 00:12:09.883 "name": "pt1", 00:12:09.883 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:09.883 "is_configured": true, 00:12:09.883 "data_offset": 2048, 00:12:09.883 "data_size": 63488 00:12:09.883 }, 00:12:09.883 { 00:12:09.883 "name": "pt2", 00:12:09.883 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:09.883 "is_configured": true, 00:12:09.883 "data_offset": 2048, 00:12:09.883 "data_size": 63488 00:12:09.883 } 00:12:09.883 ] 00:12:09.883 }' 00:12:09.883 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.883 22:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.452 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:10.452 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:10.452 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:10.452 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:10.452 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:10.452 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:10.452 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:10.452 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:10.712 [2024-07-12 22:19:20.901087] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:10.712 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:10.712 "name": "raid_bdev1", 00:12:10.712 "aliases": [ 00:12:10.712 "c862d237-6467-485b-b351-36af06a29f2a" 00:12:10.712 ], 00:12:10.712 "product_name": "Raid Volume", 00:12:10.712 "block_size": 512, 00:12:10.712 "num_blocks": 126976, 00:12:10.712 "uuid": 
"c862d237-6467-485b-b351-36af06a29f2a", 00:12:10.712 "assigned_rate_limits": { 00:12:10.712 "rw_ios_per_sec": 0, 00:12:10.712 "rw_mbytes_per_sec": 0, 00:12:10.712 "r_mbytes_per_sec": 0, 00:12:10.712 "w_mbytes_per_sec": 0 00:12:10.712 }, 00:12:10.712 "claimed": false, 00:12:10.712 "zoned": false, 00:12:10.712 "supported_io_types": { 00:12:10.712 "read": true, 00:12:10.712 "write": true, 00:12:10.712 "unmap": true, 00:12:10.712 "flush": true, 00:12:10.712 "reset": true, 00:12:10.712 "nvme_admin": false, 00:12:10.712 "nvme_io": false, 00:12:10.712 "nvme_io_md": false, 00:12:10.712 "write_zeroes": true, 00:12:10.712 "zcopy": false, 00:12:10.712 "get_zone_info": false, 00:12:10.712 "zone_management": false, 00:12:10.712 "zone_append": false, 00:12:10.712 "compare": false, 00:12:10.712 "compare_and_write": false, 00:12:10.712 "abort": false, 00:12:10.712 "seek_hole": false, 00:12:10.712 "seek_data": false, 00:12:10.712 "copy": false, 00:12:10.712 "nvme_iov_md": false 00:12:10.712 }, 00:12:10.712 "memory_domains": [ 00:12:10.712 { 00:12:10.712 "dma_device_id": "system", 00:12:10.712 "dma_device_type": 1 00:12:10.712 }, 00:12:10.712 { 00:12:10.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.712 "dma_device_type": 2 00:12:10.712 }, 00:12:10.712 { 00:12:10.712 "dma_device_id": "system", 00:12:10.712 "dma_device_type": 1 00:12:10.712 }, 00:12:10.712 { 00:12:10.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.712 "dma_device_type": 2 00:12:10.712 } 00:12:10.712 ], 00:12:10.712 "driver_specific": { 00:12:10.712 "raid": { 00:12:10.712 "uuid": "c862d237-6467-485b-b351-36af06a29f2a", 00:12:10.712 "strip_size_kb": 64, 00:12:10.712 "state": "online", 00:12:10.712 "raid_level": "raid0", 00:12:10.712 "superblock": true, 00:12:10.712 "num_base_bdevs": 2, 00:12:10.712 "num_base_bdevs_discovered": 2, 00:12:10.712 "num_base_bdevs_operational": 2, 00:12:10.712 "base_bdevs_list": [ 00:12:10.712 { 00:12:10.712 "name": "pt1", 00:12:10.712 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:10.712 "is_configured": true, 00:12:10.712 "data_offset": 2048, 00:12:10.712 "data_size": 63488 00:12:10.712 }, 00:12:10.712 { 00:12:10.712 "name": "pt2", 00:12:10.712 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:10.712 "is_configured": true, 00:12:10.712 "data_offset": 2048, 00:12:10.712 "data_size": 63488 00:12:10.712 } 00:12:10.712 ] 00:12:10.712 } 00:12:10.712 } 00:12:10.712 }' 00:12:10.712 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:10.712 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:10.712 pt2' 00:12:10.712 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:10.712 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:10.712 22:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:10.971 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:10.971 "name": "pt1", 00:12:10.971 "aliases": [ 00:12:10.971 "00000000-0000-0000-0000-000000000001" 00:12:10.971 ], 00:12:10.971 "product_name": "passthru", 00:12:10.971 "block_size": 512, 00:12:10.971 "num_blocks": 65536, 00:12:10.971 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:10.971 "assigned_rate_limits": { 00:12:10.971 
"rw_ios_per_sec": 0, 00:12:10.971 "rw_mbytes_per_sec": 0, 00:12:10.971 "r_mbytes_per_sec": 0, 00:12:10.971 "w_mbytes_per_sec": 0 00:12:10.971 }, 00:12:10.971 "claimed": true, 00:12:10.971 "claim_type": "exclusive_write", 00:12:10.971 "zoned": false, 00:12:10.971 "supported_io_types": { 00:12:10.971 "read": true, 00:12:10.971 "write": true, 00:12:10.971 "unmap": true, 00:12:10.971 "flush": true, 00:12:10.971 "reset": true, 00:12:10.971 "nvme_admin": false, 00:12:10.971 "nvme_io": false, 00:12:10.971 "nvme_io_md": false, 00:12:10.971 "write_zeroes": true, 00:12:10.971 "zcopy": true, 00:12:10.971 "get_zone_info": false, 00:12:10.971 "zone_management": false, 00:12:10.971 "zone_append": false, 00:12:10.971 "compare": false, 00:12:10.971 "compare_and_write": false, 00:12:10.971 "abort": true, 00:12:10.971 "seek_hole": false, 00:12:10.971 "seek_data": false, 00:12:10.971 "copy": true, 00:12:10.971 "nvme_iov_md": false 00:12:10.971 }, 00:12:10.971 "memory_domains": [ 00:12:10.971 { 00:12:10.971 "dma_device_id": "system", 00:12:10.971 "dma_device_type": 1 00:12:10.971 }, 00:12:10.971 { 00:12:10.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.971 "dma_device_type": 2 00:12:10.971 } 00:12:10.971 ], 00:12:10.971 "driver_specific": { 00:12:10.971 "passthru": { 00:12:10.971 "name": "pt1", 00:12:10.971 "base_bdev_name": "malloc1" 00:12:10.971 } 00:12:10.971 } 00:12:10.971 }' 00:12:10.971 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:10.971 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.230 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:11.230 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.230 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.230 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:11.230 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.230 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.230 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:11.230 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.230 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.490 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:11.490 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:11.490 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:11.490 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:11.490 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:11.490 "name": "pt2", 00:12:11.490 "aliases": [ 00:12:11.490 "00000000-0000-0000-0000-000000000002" 00:12:11.490 ], 00:12:11.490 "product_name": "passthru", 00:12:11.490 "block_size": 512, 00:12:11.490 "num_blocks": 65536, 00:12:11.490 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:11.490 "assigned_rate_limits": { 00:12:11.490 "rw_ios_per_sec": 0, 00:12:11.490 "rw_mbytes_per_sec": 0, 00:12:11.490 "r_mbytes_per_sec": 0, 00:12:11.490 "w_mbytes_per_sec": 0 
00:12:11.490 }, 00:12:11.490 "claimed": true, 00:12:11.490 "claim_type": "exclusive_write", 00:12:11.490 "zoned": false, 00:12:11.490 "supported_io_types": { 00:12:11.490 "read": true, 00:12:11.490 "write": true, 00:12:11.490 "unmap": true, 00:12:11.490 "flush": true, 00:12:11.490 "reset": true, 00:12:11.490 "nvme_admin": false, 00:12:11.490 "nvme_io": false, 00:12:11.490 "nvme_io_md": false, 00:12:11.490 "write_zeroes": true, 00:12:11.490 "zcopy": true, 00:12:11.490 "get_zone_info": false, 00:12:11.490 "zone_management": false, 00:12:11.490 "zone_append": false, 00:12:11.490 "compare": false, 00:12:11.490 "compare_and_write": false, 00:12:11.490 "abort": true, 00:12:11.490 "seek_hole": false, 00:12:11.490 "seek_data": false, 00:12:11.490 "copy": true, 00:12:11.490 "nvme_iov_md": false 00:12:11.490 }, 00:12:11.490 "memory_domains": [ 00:12:11.490 { 00:12:11.490 "dma_device_id": "system", 00:12:11.490 "dma_device_type": 1 00:12:11.490 }, 00:12:11.490 { 00:12:11.490 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:11.490 "dma_device_type": 2 00:12:11.490 } 00:12:11.490 ], 00:12:11.490 "driver_specific": { 00:12:11.490 "passthru": { 00:12:11.490 "name": "pt2", 00:12:11.490 "base_bdev_name": "malloc2" 00:12:11.490 } 00:12:11.490 } 00:12:11.490 }' 00:12:11.490 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.490 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.749 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:11.749 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.749 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.749 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:11.749 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.749 22:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.749 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:11.749 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.749 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:12.008 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:12.008 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:12.008 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:12.266 [2024-07-12 22:19:22.340857] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:12.266 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c862d237-6467-485b-b351-36af06a29f2a 00:12:12.266 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z c862d237-6467-485b-b351-36af06a29f2a ']' 00:12:12.266 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:12.266 [2024-07-12 22:19:22.589279] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:12.266 [2024-07-12 22:19:22.589302] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:12:12.266 [2024-07-12 22:19:22.589360] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:12.266 [2024-07-12 22:19:22.589405] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:12.266 [2024-07-12 22:19:22.589417] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e91270 name raid_bdev1, state offline 00:12:12.525 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.525 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:12.783 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:12.783 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:12.783 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:12.783 22:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:12.783 22:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:12.783 22:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:13.042 22:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:13.042 22:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:13.300 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:13.568 [2024-07-12 22:19:23.816561] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:13.569 [2024-07-12 22:19:23.817953] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:13.569 [2024-07-12 22:19:23.818011] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:13.569 [2024-07-12 22:19:23.818053] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:13.569 [2024-07-12 22:19:23.818073] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:13.569 [2024-07-12 22:19:23.818083] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e90ff0 name raid_bdev1, state configuring 00:12:13.569 request: 00:12:13.569 { 00:12:13.569 "name": "raid_bdev1", 00:12:13.569 "raid_level": "raid0", 00:12:13.569 "base_bdevs": [ 00:12:13.569 "malloc1", 00:12:13.569 "malloc2" 00:12:13.569 ], 00:12:13.569 "strip_size_kb": 64, 00:12:13.569 "superblock": false, 00:12:13.569 "method": "bdev_raid_create", 00:12:13.569 "req_id": 1 00:12:13.569 } 00:12:13.569 Got JSON-RPC error response 00:12:13.569 response: 00:12:13.569 { 00:12:13.569 "code": -17, 00:12:13.569 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:13.569 } 00:12:13.569 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:13.569 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:13.569 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:13.569 22:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:13.569 22:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.569 22:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:13.831 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:13.831 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:13.831 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:14.089 [2024-07-12 22:19:24.309797] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:14.089 [2024-07-12 22:19:24.309842] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:14.089 [2024-07-12 22:19:24.309864] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ced7a0 00:12:14.089 [2024-07-12 22:19:24.309877] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:14.089 [2024-07-12 22:19:24.311514] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:14.089 [2024-07-12 22:19:24.311542] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:14.089 [2024-07-12 22:19:24.311607] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:14.089 [2024-07-12 22:19:24.311633] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:14.089 pt1 00:12:14.089 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:12:14.089 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:14.089 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:14.089 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:14.089 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:14.089 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:14.090 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.090 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.090 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.090 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.090 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.090 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:14.348 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.348 "name": "raid_bdev1", 00:12:14.348 "uuid": "c862d237-6467-485b-b351-36af06a29f2a", 00:12:14.348 "strip_size_kb": 64, 00:12:14.348 "state": "configuring", 00:12:14.348 "raid_level": "raid0", 00:12:14.348 "superblock": true, 00:12:14.348 "num_base_bdevs": 2, 00:12:14.348 "num_base_bdevs_discovered": 1, 00:12:14.348 "num_base_bdevs_operational": 2, 00:12:14.348 "base_bdevs_list": [ 00:12:14.348 { 00:12:14.348 "name": "pt1", 00:12:14.348 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:14.348 "is_configured": true, 00:12:14.348 "data_offset": 2048, 00:12:14.348 "data_size": 63488 00:12:14.348 }, 00:12:14.348 { 00:12:14.348 "name": null, 00:12:14.348 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:14.348 "is_configured": false, 00:12:14.348 "data_offset": 2048, 00:12:14.348 "data_size": 63488 00:12:14.348 } 00:12:14.348 ] 00:12:14.348 }' 00:12:14.348 22:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.348 22:19:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:14.915 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:14.915 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:14.915 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:14.915 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:15.173 [2024-07-12 22:19:25.332540] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:15.173 [2024-07-12 22:19:25.332586] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:15.173 [2024-07-12 22:19:25.332605] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e87820 00:12:15.173 [2024-07-12 22:19:25.332617] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:15.173 [2024-07-12 22:19:25.332976] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:15.173 [2024-07-12 22:19:25.332995] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:15.173 [2024-07-12 22:19:25.333055] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:15.173 [2024-07-12 22:19:25.333075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:15.173 [2024-07-12 22:19:25.333170] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ce3ec0 00:12:15.173 [2024-07-12 22:19:25.333186] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:15.173 [2024-07-12 22:19:25.333354] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ce6530 00:12:15.173 [2024-07-12 22:19:25.333475] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ce3ec0 00:12:15.173 [2024-07-12 22:19:25.333485] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ce3ec0 00:12:15.173 [2024-07-12 22:19:25.333582] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:15.173 pt2 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.173 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:15.431 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:12:15.431 "name": "raid_bdev1", 00:12:15.431 "uuid": "c862d237-6467-485b-b351-36af06a29f2a", 00:12:15.431 "strip_size_kb": 64, 00:12:15.431 "state": "online", 00:12:15.431 "raid_level": "raid0", 00:12:15.431 "superblock": true, 00:12:15.431 "num_base_bdevs": 2, 00:12:15.431 "num_base_bdevs_discovered": 2, 00:12:15.431 "num_base_bdevs_operational": 2, 00:12:15.431 "base_bdevs_list": [ 00:12:15.431 { 00:12:15.431 "name": "pt1", 00:12:15.431 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:15.431 "is_configured": true, 00:12:15.431 "data_offset": 2048, 00:12:15.431 "data_size": 63488 00:12:15.431 }, 00:12:15.431 { 00:12:15.431 "name": "pt2", 00:12:15.431 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:15.431 "is_configured": true, 00:12:15.431 "data_offset": 2048, 00:12:15.431 "data_size": 63488 00:12:15.431 } 00:12:15.431 ] 00:12:15.431 }' 00:12:15.431 22:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:15.431 22:19:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:15.998 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:15.998 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:15.998 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:15.998 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:15.998 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:15.998 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:15.998 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:15.998 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:15.998 [2024-07-12 22:19:26.295338] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:15.998 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:15.998 "name": "raid_bdev1", 00:12:15.998 "aliases": [ 00:12:15.998 "c862d237-6467-485b-b351-36af06a29f2a" 00:12:15.998 ], 00:12:15.998 "product_name": "Raid Volume", 00:12:15.998 "block_size": 512, 00:12:15.998 "num_blocks": 126976, 00:12:15.998 "uuid": "c862d237-6467-485b-b351-36af06a29f2a", 00:12:15.998 "assigned_rate_limits": { 00:12:15.998 "rw_ios_per_sec": 0, 00:12:15.998 "rw_mbytes_per_sec": 0, 00:12:15.998 "r_mbytes_per_sec": 0, 00:12:15.998 "w_mbytes_per_sec": 0 00:12:15.998 }, 00:12:15.998 "claimed": false, 00:12:15.998 "zoned": false, 00:12:15.998 "supported_io_types": { 00:12:15.998 "read": true, 00:12:15.998 "write": true, 00:12:15.998 "unmap": true, 00:12:15.998 "flush": true, 00:12:15.998 "reset": true, 00:12:15.998 "nvme_admin": false, 00:12:15.998 "nvme_io": false, 00:12:15.998 "nvme_io_md": false, 00:12:15.998 "write_zeroes": true, 00:12:15.998 "zcopy": false, 00:12:15.998 "get_zone_info": false, 00:12:15.998 "zone_management": false, 00:12:15.998 "zone_append": false, 00:12:15.998 "compare": false, 00:12:15.998 "compare_and_write": false, 00:12:15.998 "abort": false, 00:12:15.998 "seek_hole": false, 00:12:15.998 "seek_data": false, 00:12:15.998 "copy": false, 00:12:15.998 "nvme_iov_md": false 00:12:15.998 }, 00:12:15.998 "memory_domains": [ 00:12:15.998 { 00:12:15.998 "dma_device_id": 
"system", 00:12:15.998 "dma_device_type": 1 00:12:15.998 }, 00:12:15.999 { 00:12:15.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:15.999 "dma_device_type": 2 00:12:15.999 }, 00:12:15.999 { 00:12:15.999 "dma_device_id": "system", 00:12:15.999 "dma_device_type": 1 00:12:15.999 }, 00:12:15.999 { 00:12:15.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:15.999 "dma_device_type": 2 00:12:15.999 } 00:12:15.999 ], 00:12:15.999 "driver_specific": { 00:12:15.999 "raid": { 00:12:15.999 "uuid": "c862d237-6467-485b-b351-36af06a29f2a", 00:12:15.999 "strip_size_kb": 64, 00:12:15.999 "state": "online", 00:12:15.999 "raid_level": "raid0", 00:12:15.999 "superblock": true, 00:12:15.999 "num_base_bdevs": 2, 00:12:15.999 "num_base_bdevs_discovered": 2, 00:12:15.999 "num_base_bdevs_operational": 2, 00:12:15.999 "base_bdevs_list": [ 00:12:15.999 { 00:12:15.999 "name": "pt1", 00:12:15.999 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:15.999 "is_configured": true, 00:12:15.999 "data_offset": 2048, 00:12:15.999 "data_size": 63488 00:12:15.999 }, 00:12:15.999 { 00:12:15.999 "name": "pt2", 00:12:15.999 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:15.999 "is_configured": true, 00:12:15.999 "data_offset": 2048, 00:12:15.999 "data_size": 63488 00:12:15.999 } 00:12:15.999 ] 00:12:15.999 } 00:12:15.999 } 00:12:15.999 }' 00:12:15.999 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:16.257 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:16.257 pt2' 00:12:16.257 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:16.257 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:16.257 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:16.516 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:16.516 "name": "pt1", 00:12:16.516 "aliases": [ 00:12:16.516 "00000000-0000-0000-0000-000000000001" 00:12:16.516 ], 00:12:16.516 "product_name": "passthru", 00:12:16.516 "block_size": 512, 00:12:16.516 "num_blocks": 65536, 00:12:16.516 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:16.516 "assigned_rate_limits": { 00:12:16.516 "rw_ios_per_sec": 0, 00:12:16.516 "rw_mbytes_per_sec": 0, 00:12:16.516 "r_mbytes_per_sec": 0, 00:12:16.516 "w_mbytes_per_sec": 0 00:12:16.516 }, 00:12:16.516 "claimed": true, 00:12:16.516 "claim_type": "exclusive_write", 00:12:16.516 "zoned": false, 00:12:16.516 "supported_io_types": { 00:12:16.516 "read": true, 00:12:16.516 "write": true, 00:12:16.516 "unmap": true, 00:12:16.516 "flush": true, 00:12:16.516 "reset": true, 00:12:16.516 "nvme_admin": false, 00:12:16.516 "nvme_io": false, 00:12:16.516 "nvme_io_md": false, 00:12:16.516 "write_zeroes": true, 00:12:16.516 "zcopy": true, 00:12:16.516 "get_zone_info": false, 00:12:16.516 "zone_management": false, 00:12:16.516 "zone_append": false, 00:12:16.516 "compare": false, 00:12:16.516 "compare_and_write": false, 00:12:16.516 "abort": true, 00:12:16.516 "seek_hole": false, 00:12:16.516 "seek_data": false, 00:12:16.516 "copy": true, 00:12:16.516 "nvme_iov_md": false 00:12:16.516 }, 00:12:16.516 "memory_domains": [ 00:12:16.516 { 00:12:16.516 "dma_device_id": "system", 00:12:16.516 "dma_device_type": 1 00:12:16.516 }, 
00:12:16.516 { 00:12:16.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.516 "dma_device_type": 2 00:12:16.516 } 00:12:16.516 ], 00:12:16.516 "driver_specific": { 00:12:16.516 "passthru": { 00:12:16.516 "name": "pt1", 00:12:16.516 "base_bdev_name": "malloc1" 00:12:16.516 } 00:12:16.516 } 00:12:16.516 }' 00:12:16.516 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:16.516 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:16.516 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:16.516 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:16.516 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:16.516 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:16.516 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:16.516 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:16.777 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:16.777 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:16.777 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:16.777 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:16.777 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:16.777 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:16.777 22:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:17.035 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:17.035 "name": "pt2", 00:12:17.035 "aliases": [ 00:12:17.035 "00000000-0000-0000-0000-000000000002" 00:12:17.035 ], 00:12:17.035 "product_name": "passthru", 00:12:17.035 "block_size": 512, 00:12:17.035 "num_blocks": 65536, 00:12:17.035 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:17.035 "assigned_rate_limits": { 00:12:17.035 "rw_ios_per_sec": 0, 00:12:17.035 "rw_mbytes_per_sec": 0, 00:12:17.035 "r_mbytes_per_sec": 0, 00:12:17.035 "w_mbytes_per_sec": 0 00:12:17.035 }, 00:12:17.035 "claimed": true, 00:12:17.035 "claim_type": "exclusive_write", 00:12:17.035 "zoned": false, 00:12:17.035 "supported_io_types": { 00:12:17.035 "read": true, 00:12:17.035 "write": true, 00:12:17.035 "unmap": true, 00:12:17.035 "flush": true, 00:12:17.035 "reset": true, 00:12:17.035 "nvme_admin": false, 00:12:17.035 "nvme_io": false, 00:12:17.035 "nvme_io_md": false, 00:12:17.035 "write_zeroes": true, 00:12:17.035 "zcopy": true, 00:12:17.035 "get_zone_info": false, 00:12:17.035 "zone_management": false, 00:12:17.035 "zone_append": false, 00:12:17.035 "compare": false, 00:12:17.035 "compare_and_write": false, 00:12:17.035 "abort": true, 00:12:17.035 "seek_hole": false, 00:12:17.035 "seek_data": false, 00:12:17.035 "copy": true, 00:12:17.035 "nvme_iov_md": false 00:12:17.035 }, 00:12:17.035 "memory_domains": [ 00:12:17.035 { 00:12:17.035 "dma_device_id": "system", 00:12:17.035 "dma_device_type": 1 00:12:17.035 }, 00:12:17.035 { 00:12:17.035 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:17.035 "dma_device_type": 2 00:12:17.035 } 00:12:17.035 ], 
00:12:17.035 "driver_specific": { 00:12:17.035 "passthru": { 00:12:17.035 "name": "pt2", 00:12:17.035 "base_bdev_name": "malloc2" 00:12:17.035 } 00:12:17.035 } 00:12:17.035 }' 00:12:17.035 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:17.036 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:17.036 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:17.036 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:17.036 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:17.294 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:17.294 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:17.294 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:17.294 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:17.294 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:17.294 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:17.294 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:17.294 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:17.294 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:17.553 [2024-07-12 22:19:27.763228] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' c862d237-6467-485b-b351-36af06a29f2a '!=' c862d237-6467-485b-b351-36af06a29f2a ']' 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3423997 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3423997 ']' 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3423997 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3423997 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3423997' 00:12:17.553 killing process with pid 3423997 00:12:17.553 22:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3423997 00:12:17.553 [2024-07-12 22:19:27.835053] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:17.553 22:19:27 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3423997 00:12:17.553 [2024-07-12 22:19:27.835107] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:17.553 [2024-07-12 22:19:27.835150] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:17.553 [2024-07-12 22:19:27.835161] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ce3ec0 name raid_bdev1, state offline 00:12:17.553 [2024-07-12 22:19:27.852826] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:17.812 22:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:17.812 00:12:17.812 real 0m10.447s 00:12:17.812 user 0m18.550s 00:12:17.812 sys 0m1.994s 00:12:17.812 22:19:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:17.812 22:19:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.812 ************************************ 00:12:17.812 END TEST raid_superblock_test 00:12:17.812 ************************************ 00:12:17.812 22:19:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:17.812 22:19:28 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:12:17.812 22:19:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:17.812 22:19:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:17.812 22:19:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:18.071 ************************************ 00:12:18.071 START TEST raid_read_error_test 00:12:18.071 ************************************ 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:18.071 22:19:28 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.tn5V3zIwja 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3425486 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3425486 /var/tmp/spdk-raid.sock 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3425486 ']' 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:18.071 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:18.071 22:19:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:18.071 [2024-07-12 22:19:28.222322] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
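At this point bdevperf has been started with -z, so it sits idle on /var/tmp/spdk-raid.sock while the rest of the setup in the trace below is driven through rpc.py. Condensed into a plain shell sequence (SPDK_DIR is shorthand for the checkout path used in the trace, and the real harness also waits for the RPC socket before issuing calls), the read-error flow is roughly:

    RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # bdevperf idles (-z) until perform_tests is sent over the RPC socket
    "$SPDK_DIR/build/examples/bdevperf" -r /var/tmp/spdk-raid.sock \
        -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid &

    for i in 1 2; do
        $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc      # backing malloc bdev
        $RPC bdev_error_create BaseBdev${i}_malloc                 # error-injection wrapper (EE_* bdev)
        $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
    done
    $RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s

    # kick off the queued job, then fail reads on the first base bdev while I/O runs
    "$SPDK_DIR/examples/bdev/bdevperf/bdevperf.py" -s /var/tmp/spdk-raid.sock perform_tests &
    sleep 1
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure
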
00:12:18.071 [2024-07-12 22:19:28.222387] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3425486 ] 00:12:18.071 [2024-07-12 22:19:28.352617] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:18.330 [2024-07-12 22:19:28.461901] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:18.330 [2024-07-12 22:19:28.518876] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:18.330 [2024-07-12 22:19:28.518908] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:18.896 22:19:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:18.896 22:19:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:18.896 22:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:18.896 22:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:19.154 BaseBdev1_malloc 00:12:19.154 22:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:19.413 true 00:12:19.413 22:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:19.672 [2024-07-12 22:19:29.850467] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:19.672 [2024-07-12 22:19:29.850514] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:19.672 [2024-07-12 22:19:29.850537] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b1e0d0 00:12:19.672 [2024-07-12 22:19:29.850556] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:19.672 [2024-07-12 22:19:29.852484] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:19.672 [2024-07-12 22:19:29.852517] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:19.672 BaseBdev1 00:12:19.672 22:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:19.672 22:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:19.930 BaseBdev2_malloc 00:12:19.930 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:20.188 true 00:12:20.188 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:20.447 [2024-07-12 22:19:30.602376] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:20.447 [2024-07-12 22:19:30.602425] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:20.447 [2024-07-12 22:19:30.602445] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b22910 00:12:20.447 [2024-07-12 22:19:30.602458] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:20.447 [2024-07-12 22:19:30.604056] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:20.447 [2024-07-12 22:19:30.604085] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:20.447 BaseBdev2 00:12:20.447 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:20.705 [2024-07-12 22:19:30.847056] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:20.705 [2024-07-12 22:19:30.848440] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:20.705 [2024-07-12 22:19:30.848641] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b24320 00:12:20.705 [2024-07-12 22:19:30.848654] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:20.705 [2024-07-12 22:19:30.848855] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b23270 00:12:20.705 [2024-07-12 22:19:30.849013] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b24320 00:12:20.705 [2024-07-12 22:19:30.849023] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b24320 00:12:20.705 [2024-07-12 22:19:30.849132] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:20.705 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:20.705 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:20.705 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:20.705 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:20.705 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:20.705 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:20.705 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:20.705 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:20.705 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:20.705 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:20.705 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.705 22:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:20.964 22:19:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:20.964 "name": "raid_bdev1", 00:12:20.964 "uuid": "e3103748-68bd-42e0-bdb0-8f99160d45de", 00:12:20.964 "strip_size_kb": 64, 00:12:20.964 "state": "online", 00:12:20.964 "raid_level": "raid0", 
00:12:20.964 "superblock": true, 00:12:20.964 "num_base_bdevs": 2, 00:12:20.964 "num_base_bdevs_discovered": 2, 00:12:20.964 "num_base_bdevs_operational": 2, 00:12:20.964 "base_bdevs_list": [ 00:12:20.964 { 00:12:20.964 "name": "BaseBdev1", 00:12:20.964 "uuid": "45b3fb24-e3f8-591e-ab56-40d81875dbf3", 00:12:20.964 "is_configured": true, 00:12:20.964 "data_offset": 2048, 00:12:20.964 "data_size": 63488 00:12:20.964 }, 00:12:20.964 { 00:12:20.964 "name": "BaseBdev2", 00:12:20.964 "uuid": "7e3a0e60-4dd1-5a07-b83c-f72cc57d584d", 00:12:20.964 "is_configured": true, 00:12:20.964 "data_offset": 2048, 00:12:20.964 "data_size": 63488 00:12:20.964 } 00:12:20.964 ] 00:12:20.964 }' 00:12:20.964 22:19:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:20.964 22:19:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.531 22:19:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:21.531 22:19:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:21.531 [2024-07-12 22:19:31.809886] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b1f9b0 00:12:22.467 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.783 22:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:23.041 22:19:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:23.041 "name": "raid_bdev1", 00:12:23.041 "uuid": "e3103748-68bd-42e0-bdb0-8f99160d45de", 00:12:23.041 "strip_size_kb": 64, 00:12:23.041 "state": "online", 00:12:23.041 
"raid_level": "raid0", 00:12:23.041 "superblock": true, 00:12:23.041 "num_base_bdevs": 2, 00:12:23.041 "num_base_bdevs_discovered": 2, 00:12:23.041 "num_base_bdevs_operational": 2, 00:12:23.041 "base_bdevs_list": [ 00:12:23.041 { 00:12:23.041 "name": "BaseBdev1", 00:12:23.041 "uuid": "45b3fb24-e3f8-591e-ab56-40d81875dbf3", 00:12:23.041 "is_configured": true, 00:12:23.041 "data_offset": 2048, 00:12:23.041 "data_size": 63488 00:12:23.041 }, 00:12:23.041 { 00:12:23.041 "name": "BaseBdev2", 00:12:23.041 "uuid": "7e3a0e60-4dd1-5a07-b83c-f72cc57d584d", 00:12:23.041 "is_configured": true, 00:12:23.041 "data_offset": 2048, 00:12:23.041 "data_size": 63488 00:12:23.041 } 00:12:23.041 ] 00:12:23.041 }' 00:12:23.041 22:19:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:23.041 22:19:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.608 22:19:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:23.866 [2024-07-12 22:19:33.994079] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:23.866 [2024-07-12 22:19:33.994117] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:23.866 [2024-07-12 22:19:33.997273] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:23.866 [2024-07-12 22:19:33.997303] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:23.866 [2024-07-12 22:19:33.997331] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:23.866 [2024-07-12 22:19:33.997342] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b24320 name raid_bdev1, state offline 00:12:23.866 0 00:12:23.866 22:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3425486 00:12:23.866 22:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3425486 ']' 00:12:23.866 22:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3425486 00:12:23.866 22:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:23.866 22:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:23.866 22:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3425486 00:12:23.866 22:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:23.866 22:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:23.866 22:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3425486' 00:12:23.866 killing process with pid 3425486 00:12:23.866 22:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3425486 00:12:23.866 [2024-07-12 22:19:34.063651] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:23.866 22:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3425486 00:12:23.866 [2024-07-12 22:19:34.074422] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:24.124 22:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.tn5V3zIwja 00:12:24.124 22:19:34 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:24.124 22:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:24.124 22:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:12:24.124 22:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:24.124 22:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:24.124 22:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:24.124 22:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:12:24.124 00:12:24.124 real 0m6.159s 00:12:24.124 user 0m9.555s 00:12:24.124 sys 0m1.115s 00:12:24.124 22:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:24.124 22:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.124 ************************************ 00:12:24.124 END TEST raid_read_error_test 00:12:24.124 ************************************ 00:12:24.124 22:19:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:24.124 22:19:34 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:12:24.124 22:19:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:24.124 22:19:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:24.124 22:19:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:24.124 ************************************ 00:12:24.124 START TEST raid_write_error_test 00:12:24.124 ************************************ 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:24.124 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.BssezcFxAp 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3426460 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3426460 /var/tmp/spdk-raid.sock 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3426460 ']' 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:24.125 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:24.125 22:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.383 [2024-07-12 22:19:34.460337] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
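The raid_write_error_test that begins here builds the same malloc -> error -> passthru -> raid0 stack as the read test above; the functional differences are the injected I/O type and this test's own bdevperf log file. A minimal sketch of the expected differing steps, assuming the same RPC shorthand as before:

    RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # counterpart of the read-failure injection above, failing writes on the first base bdev
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
    # the failures-per-second figure is then extracted the same way the read test did,
    # applied to this test's bdevperf log
    grep -v Job /raidtest/tmp.BssezcFxAp | grep raid_bdev1 | awk '{print $6}'
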
00:12:24.383 [2024-07-12 22:19:34.460403] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3426460 ] 00:12:24.383 [2024-07-12 22:19:34.590419] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:24.383 [2024-07-12 22:19:34.696485] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:24.648 [2024-07-12 22:19:34.762295] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:24.648 [2024-07-12 22:19:34.762335] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:25.234 22:19:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:25.234 22:19:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:25.234 22:19:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:25.234 22:19:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:25.506 BaseBdev1_malloc 00:12:25.506 22:19:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:25.764 true 00:12:25.764 22:19:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:26.022 [2024-07-12 22:19:36.116298] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:26.022 [2024-07-12 22:19:36.116345] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:26.022 [2024-07-12 22:19:36.116367] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc280d0 00:12:26.023 [2024-07-12 22:19:36.116380] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:26.023 [2024-07-12 22:19:36.118293] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:26.023 [2024-07-12 22:19:36.118326] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:26.023 BaseBdev1 00:12:26.023 22:19:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:26.023 22:19:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:26.281 BaseBdev2_malloc 00:12:26.281 22:19:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:26.539 true 00:12:26.539 22:19:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:26.539 [2024-07-12 22:19:36.848268] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:26.539 [2024-07-12 22:19:36.848314] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:26.539 [2024-07-12 22:19:36.848336] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc2c910 00:12:26.539 [2024-07-12 22:19:36.848348] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:26.539 [2024-07-12 22:19:36.849916] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:26.539 [2024-07-12 22:19:36.849952] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:26.539 BaseBdev2 00:12:26.797 22:19:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:26.797 [2024-07-12 22:19:37.092956] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:26.797 [2024-07-12 22:19:37.094311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:26.797 [2024-07-12 22:19:37.094510] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc2e320 00:12:26.797 [2024-07-12 22:19:37.094524] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:26.797 [2024-07-12 22:19:37.094722] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc2d270 00:12:26.797 [2024-07-12 22:19:37.094868] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc2e320 00:12:26.797 [2024-07-12 22:19:37.094878] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc2e320 00:12:26.797 [2024-07-12 22:19:37.094996] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:26.797 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:26.797 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:26.797 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:26.797 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:26.797 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:26.797 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:26.797 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:26.797 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:26.797 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:26.797 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.797 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.797 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:27.055 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:27.055 "name": "raid_bdev1", 00:12:27.055 "uuid": "37a222ec-5f64-449c-bf2f-e6965b72e5d9", 00:12:27.055 "strip_size_kb": 64, 00:12:27.055 "state": "online", 00:12:27.055 "raid_level": 
"raid0", 00:12:27.055 "superblock": true, 00:12:27.055 "num_base_bdevs": 2, 00:12:27.055 "num_base_bdevs_discovered": 2, 00:12:27.055 "num_base_bdevs_operational": 2, 00:12:27.055 "base_bdevs_list": [ 00:12:27.055 { 00:12:27.055 "name": "BaseBdev1", 00:12:27.055 "uuid": "11f0af44-73c5-5c55-be03-8db3e5ce798b", 00:12:27.055 "is_configured": true, 00:12:27.055 "data_offset": 2048, 00:12:27.055 "data_size": 63488 00:12:27.055 }, 00:12:27.055 { 00:12:27.055 "name": "BaseBdev2", 00:12:27.055 "uuid": "63b42098-4f8a-58ea-a9c6-282292ae7b33", 00:12:27.056 "is_configured": true, 00:12:27.056 "data_offset": 2048, 00:12:27.056 "data_size": 63488 00:12:27.056 } 00:12:27.056 ] 00:12:27.056 }' 00:12:27.056 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:27.056 22:19:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:27.621 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:27.621 22:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:27.879 [2024-07-12 22:19:38.051762] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc299b0 00:12:28.812 22:19:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.071 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:29.330 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.330 "name": "raid_bdev1", 00:12:29.330 "uuid": "37a222ec-5f64-449c-bf2f-e6965b72e5d9", 00:12:29.330 "strip_size_kb": 64, 00:12:29.330 
"state": "online", 00:12:29.330 "raid_level": "raid0", 00:12:29.330 "superblock": true, 00:12:29.330 "num_base_bdevs": 2, 00:12:29.330 "num_base_bdevs_discovered": 2, 00:12:29.330 "num_base_bdevs_operational": 2, 00:12:29.330 "base_bdevs_list": [ 00:12:29.330 { 00:12:29.330 "name": "BaseBdev1", 00:12:29.330 "uuid": "11f0af44-73c5-5c55-be03-8db3e5ce798b", 00:12:29.330 "is_configured": true, 00:12:29.330 "data_offset": 2048, 00:12:29.330 "data_size": 63488 00:12:29.330 }, 00:12:29.330 { 00:12:29.330 "name": "BaseBdev2", 00:12:29.330 "uuid": "63b42098-4f8a-58ea-a9c6-282292ae7b33", 00:12:29.330 "is_configured": true, 00:12:29.330 "data_offset": 2048, 00:12:29.330 "data_size": 63488 00:12:29.330 } 00:12:29.330 ] 00:12:29.330 }' 00:12:29.330 22:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.330 22:19:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.896 22:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:30.155 [2024-07-12 22:19:40.265000] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:30.155 [2024-07-12 22:19:40.265042] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:30.155 [2024-07-12 22:19:40.268225] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:30.155 [2024-07-12 22:19:40.268256] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:30.155 [2024-07-12 22:19:40.268283] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:30.155 [2024-07-12 22:19:40.268295] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc2e320 name raid_bdev1, state offline 00:12:30.155 0 00:12:30.155 22:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3426460 00:12:30.155 22:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3426460 ']' 00:12:30.155 22:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3426460 00:12:30.155 22:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:30.155 22:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:30.155 22:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3426460 00:12:30.155 22:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:30.155 22:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:30.155 22:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3426460' 00:12:30.155 killing process with pid 3426460 00:12:30.155 22:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3426460 00:12:30.155 [2024-07-12 22:19:40.336419] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:30.155 22:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3426460 00:12:30.155 [2024-07-12 22:19:40.346685] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:30.415 22:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.BssezcFxAp 00:12:30.415 
22:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:30.415 22:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:30.415 22:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:12:30.415 22:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:30.415 22:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:30.415 22:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:30.415 22:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:12:30.415 00:12:30.415 real 0m6.183s 00:12:30.415 user 0m9.665s 00:12:30.415 sys 0m1.094s 00:12:30.415 22:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:30.415 22:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:30.415 ************************************ 00:12:30.415 END TEST raid_write_error_test 00:12:30.415 ************************************ 00:12:30.415 22:19:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:30.415 22:19:40 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:30.415 22:19:40 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:12:30.415 22:19:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:30.415 22:19:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:30.415 22:19:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:30.415 ************************************ 00:12:30.415 START TEST raid_state_function_test 00:12:30.415 ************************************ 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 
-- # local raid_bdev_name=Existed_Raid 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3427423 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3427423' 00:12:30.415 Process raid pid: 3427423 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3427423 /var/tmp/spdk-raid.sock 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3427423 ']' 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:30.415 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:30.415 22:19:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:30.415 [2024-07-12 22:19:40.731682] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
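The bdev_svc app started just above (test/app/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid) serves the RPCs for raid_state_function_test, which watches the "state" field that bdev_raid_get_bdevs reports for a two-disk concat array as its members come and go. Stripped of the delete/recreate bookkeeping and the per-bdev property checks, the sequence below is an illustrative condensation of the trace that follows; RPC and SOCK are again shorthand, and the trailing .state projection is added here only to highlight the field being verified.

RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock

# creating the array before its base bdevs exist leaves it "configuring"
$RPC -s $SOCK bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
$RPC -s $SOCK bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # configuring

# once both malloc base bdevs are registered the array transitions to "online"
$RPC -s $SOCK bdev_malloc_create 32 512 -b BaseBdev1
$RPC -s $SOCK bdev_malloc_create 32 512 -b BaseBdev2
$RPC -s $SOCK bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # online

# concat has no redundancy, so removing one member takes the array "offline"
$RPC -s $SOCK bdev_malloc_delete BaseBdev1
$RPC -s $SOCK bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # offline

# cleanup
$RPC -s $SOCK bdev_malloc_delete BaseBdev2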
00:12:30.415 [2024-07-12 22:19:40.731751] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:30.674 [2024-07-12 22:19:40.861868] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:30.674 [2024-07-12 22:19:40.964780] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:30.933 [2024-07-12 22:19:41.020823] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:30.933 [2024-07-12 22:19:41.020851] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:31.500 22:19:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:31.500 22:19:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:31.500 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:31.759 [2024-07-12 22:19:41.882797] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:31.759 [2024-07-12 22:19:41.882842] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:31.759 [2024-07-12 22:19:41.882854] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:31.759 [2024-07-12 22:19:41.882866] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:31.759 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:31.759 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:31.759 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:31.759 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:31.759 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:31.759 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:31.759 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:31.759 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:31.759 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:31.759 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:31.759 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.759 22:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:32.017 22:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:32.017 "name": "Existed_Raid", 00:12:32.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.017 "strip_size_kb": 64, 00:12:32.017 "state": "configuring", 00:12:32.017 "raid_level": "concat", 00:12:32.017 "superblock": false, 
00:12:32.017 "num_base_bdevs": 2, 00:12:32.017 "num_base_bdevs_discovered": 0, 00:12:32.017 "num_base_bdevs_operational": 2, 00:12:32.017 "base_bdevs_list": [ 00:12:32.017 { 00:12:32.017 "name": "BaseBdev1", 00:12:32.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.017 "is_configured": false, 00:12:32.017 "data_offset": 0, 00:12:32.017 "data_size": 0 00:12:32.017 }, 00:12:32.017 { 00:12:32.017 "name": "BaseBdev2", 00:12:32.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.017 "is_configured": false, 00:12:32.017 "data_offset": 0, 00:12:32.017 "data_size": 0 00:12:32.017 } 00:12:32.017 ] 00:12:32.017 }' 00:12:32.017 22:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:32.017 22:19:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:32.582 22:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:32.840 [2024-07-12 22:19:42.973531] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:32.840 [2024-07-12 22:19:42.973565] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24afa80 name Existed_Raid, state configuring 00:12:32.840 22:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:33.098 [2024-07-12 22:19:43.218198] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:33.098 [2024-07-12 22:19:43.218230] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:33.098 [2024-07-12 22:19:43.218240] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:33.098 [2024-07-12 22:19:43.218251] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:33.098 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:33.356 [2024-07-12 22:19:43.472881] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:33.356 BaseBdev1 00:12:33.356 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:33.356 22:19:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:33.356 22:19:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:33.356 22:19:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:33.356 22:19:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:33.356 22:19:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:33.356 22:19:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:33.614 22:19:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 
00:12:33.871 [ 00:12:33.871 { 00:12:33.871 "name": "BaseBdev1", 00:12:33.871 "aliases": [ 00:12:33.871 "9b4fd7fe-7b89-422d-a518-1e7b0777e87d" 00:12:33.871 ], 00:12:33.871 "product_name": "Malloc disk", 00:12:33.871 "block_size": 512, 00:12:33.871 "num_blocks": 65536, 00:12:33.871 "uuid": "9b4fd7fe-7b89-422d-a518-1e7b0777e87d", 00:12:33.871 "assigned_rate_limits": { 00:12:33.871 "rw_ios_per_sec": 0, 00:12:33.871 "rw_mbytes_per_sec": 0, 00:12:33.871 "r_mbytes_per_sec": 0, 00:12:33.871 "w_mbytes_per_sec": 0 00:12:33.871 }, 00:12:33.871 "claimed": true, 00:12:33.871 "claim_type": "exclusive_write", 00:12:33.871 "zoned": false, 00:12:33.871 "supported_io_types": { 00:12:33.871 "read": true, 00:12:33.871 "write": true, 00:12:33.871 "unmap": true, 00:12:33.871 "flush": true, 00:12:33.871 "reset": true, 00:12:33.871 "nvme_admin": false, 00:12:33.871 "nvme_io": false, 00:12:33.871 "nvme_io_md": false, 00:12:33.871 "write_zeroes": true, 00:12:33.871 "zcopy": true, 00:12:33.871 "get_zone_info": false, 00:12:33.871 "zone_management": false, 00:12:33.871 "zone_append": false, 00:12:33.871 "compare": false, 00:12:33.871 "compare_and_write": false, 00:12:33.871 "abort": true, 00:12:33.871 "seek_hole": false, 00:12:33.871 "seek_data": false, 00:12:33.871 "copy": true, 00:12:33.871 "nvme_iov_md": false 00:12:33.871 }, 00:12:33.871 "memory_domains": [ 00:12:33.871 { 00:12:33.871 "dma_device_id": "system", 00:12:33.871 "dma_device_type": 1 00:12:33.871 }, 00:12:33.871 { 00:12:33.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.871 "dma_device_type": 2 00:12:33.871 } 00:12:33.872 ], 00:12:33.872 "driver_specific": {} 00:12:33.872 } 00:12:33.872 ] 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.872 22:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:34.130 22:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:34.130 "name": "Existed_Raid", 00:12:34.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.130 "strip_size_kb": 64, 00:12:34.130 "state": "configuring", 00:12:34.130 
"raid_level": "concat", 00:12:34.130 "superblock": false, 00:12:34.130 "num_base_bdevs": 2, 00:12:34.130 "num_base_bdevs_discovered": 1, 00:12:34.130 "num_base_bdevs_operational": 2, 00:12:34.130 "base_bdevs_list": [ 00:12:34.130 { 00:12:34.130 "name": "BaseBdev1", 00:12:34.130 "uuid": "9b4fd7fe-7b89-422d-a518-1e7b0777e87d", 00:12:34.130 "is_configured": true, 00:12:34.130 "data_offset": 0, 00:12:34.130 "data_size": 65536 00:12:34.130 }, 00:12:34.130 { 00:12:34.130 "name": "BaseBdev2", 00:12:34.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.130 "is_configured": false, 00:12:34.130 "data_offset": 0, 00:12:34.130 "data_size": 0 00:12:34.130 } 00:12:34.130 ] 00:12:34.130 }' 00:12:34.130 22:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:34.130 22:19:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.695 22:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:34.953 [2024-07-12 22:19:45.049120] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:34.953 [2024-07-12 22:19:45.049162] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24af350 name Existed_Raid, state configuring 00:12:34.953 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:35.211 [2024-07-12 22:19:45.293795] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:35.211 [2024-07-12 22:19:45.295306] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:35.211 [2024-07-12 22:19:45.295343] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.211 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:35.469 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:35.469 "name": "Existed_Raid", 00:12:35.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.469 "strip_size_kb": 64, 00:12:35.469 "state": "configuring", 00:12:35.469 "raid_level": "concat", 00:12:35.469 "superblock": false, 00:12:35.469 "num_base_bdevs": 2, 00:12:35.469 "num_base_bdevs_discovered": 1, 00:12:35.469 "num_base_bdevs_operational": 2, 00:12:35.469 "base_bdevs_list": [ 00:12:35.469 { 00:12:35.469 "name": "BaseBdev1", 00:12:35.469 "uuid": "9b4fd7fe-7b89-422d-a518-1e7b0777e87d", 00:12:35.469 "is_configured": true, 00:12:35.469 "data_offset": 0, 00:12:35.469 "data_size": 65536 00:12:35.469 }, 00:12:35.469 { 00:12:35.469 "name": "BaseBdev2", 00:12:35.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.469 "is_configured": false, 00:12:35.469 "data_offset": 0, 00:12:35.469 "data_size": 0 00:12:35.469 } 00:12:35.469 ] 00:12:35.469 }' 00:12:35.469 22:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:35.469 22:19:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.035 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:36.293 [2024-07-12 22:19:46.388106] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:36.293 [2024-07-12 22:19:46.388145] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24b0000 00:12:36.293 [2024-07-12 22:19:46.388154] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:36.293 [2024-07-12 22:19:46.388342] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ca0c0 00:12:36.293 [2024-07-12 22:19:46.388465] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24b0000 00:12:36.293 [2024-07-12 22:19:46.388475] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24b0000 00:12:36.293 [2024-07-12 22:19:46.388644] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:36.293 BaseBdev2 00:12:36.293 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:36.293 22:19:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:36.293 22:19:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:36.293 22:19:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:36.293 22:19:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:36.293 22:19:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:36.293 22:19:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:36.552 22:19:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:36.810 [ 00:12:36.810 { 00:12:36.810 "name": "BaseBdev2", 00:12:36.810 "aliases": [ 00:12:36.810 "9af7ecac-8242-4ea4-9c17-841fbf6cc7ef" 00:12:36.810 ], 00:12:36.810 "product_name": "Malloc disk", 00:12:36.810 "block_size": 512, 00:12:36.810 "num_blocks": 65536, 00:12:36.810 "uuid": "9af7ecac-8242-4ea4-9c17-841fbf6cc7ef", 00:12:36.810 "assigned_rate_limits": { 00:12:36.810 "rw_ios_per_sec": 0, 00:12:36.810 "rw_mbytes_per_sec": 0, 00:12:36.810 "r_mbytes_per_sec": 0, 00:12:36.810 "w_mbytes_per_sec": 0 00:12:36.810 }, 00:12:36.810 "claimed": true, 00:12:36.810 "claim_type": "exclusive_write", 00:12:36.810 "zoned": false, 00:12:36.810 "supported_io_types": { 00:12:36.810 "read": true, 00:12:36.810 "write": true, 00:12:36.810 "unmap": true, 00:12:36.810 "flush": true, 00:12:36.810 "reset": true, 00:12:36.810 "nvme_admin": false, 00:12:36.810 "nvme_io": false, 00:12:36.810 "nvme_io_md": false, 00:12:36.810 "write_zeroes": true, 00:12:36.810 "zcopy": true, 00:12:36.810 "get_zone_info": false, 00:12:36.810 "zone_management": false, 00:12:36.810 "zone_append": false, 00:12:36.810 "compare": false, 00:12:36.810 "compare_and_write": false, 00:12:36.810 "abort": true, 00:12:36.810 "seek_hole": false, 00:12:36.810 "seek_data": false, 00:12:36.810 "copy": true, 00:12:36.810 "nvme_iov_md": false 00:12:36.810 }, 00:12:36.810 "memory_domains": [ 00:12:36.810 { 00:12:36.810 "dma_device_id": "system", 00:12:36.810 "dma_device_type": 1 00:12:36.810 }, 00:12:36.810 { 00:12:36.810 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.810 "dma_device_type": 2 00:12:36.810 } 00:12:36.810 ], 00:12:36.810 "driver_specific": {} 00:12:36.810 } 00:12:36.810 ] 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.810 22:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:37.068 22:19:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:37.068 "name": "Existed_Raid", 00:12:37.068 "uuid": "987820c8-4819-448e-b1cd-511bb2f438db", 00:12:37.068 "strip_size_kb": 64, 00:12:37.068 "state": "online", 00:12:37.068 "raid_level": "concat", 00:12:37.068 "superblock": false, 00:12:37.068 "num_base_bdevs": 2, 00:12:37.068 "num_base_bdevs_discovered": 2, 00:12:37.068 "num_base_bdevs_operational": 2, 00:12:37.068 "base_bdevs_list": [ 00:12:37.068 { 00:12:37.068 "name": "BaseBdev1", 00:12:37.068 "uuid": "9b4fd7fe-7b89-422d-a518-1e7b0777e87d", 00:12:37.068 "is_configured": true, 00:12:37.068 "data_offset": 0, 00:12:37.068 "data_size": 65536 00:12:37.068 }, 00:12:37.068 { 00:12:37.068 "name": "BaseBdev2", 00:12:37.068 "uuid": "9af7ecac-8242-4ea4-9c17-841fbf6cc7ef", 00:12:37.068 "is_configured": true, 00:12:37.068 "data_offset": 0, 00:12:37.068 "data_size": 65536 00:12:37.068 } 00:12:37.068 ] 00:12:37.068 }' 00:12:37.068 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:37.068 22:19:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:37.635 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:37.635 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:37.635 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:37.635 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:37.635 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:37.635 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:37.635 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:37.635 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:37.635 [2024-07-12 22:19:47.908425] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:37.635 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:37.635 "name": "Existed_Raid", 00:12:37.635 "aliases": [ 00:12:37.635 "987820c8-4819-448e-b1cd-511bb2f438db" 00:12:37.635 ], 00:12:37.635 "product_name": "Raid Volume", 00:12:37.635 "block_size": 512, 00:12:37.635 "num_blocks": 131072, 00:12:37.635 "uuid": "987820c8-4819-448e-b1cd-511bb2f438db", 00:12:37.635 "assigned_rate_limits": { 00:12:37.635 "rw_ios_per_sec": 0, 00:12:37.635 "rw_mbytes_per_sec": 0, 00:12:37.635 "r_mbytes_per_sec": 0, 00:12:37.635 "w_mbytes_per_sec": 0 00:12:37.635 }, 00:12:37.635 "claimed": false, 00:12:37.635 "zoned": false, 00:12:37.635 "supported_io_types": { 00:12:37.635 "read": true, 00:12:37.635 "write": true, 00:12:37.635 "unmap": true, 00:12:37.635 "flush": true, 00:12:37.635 "reset": true, 00:12:37.635 "nvme_admin": false, 00:12:37.635 "nvme_io": false, 00:12:37.635 "nvme_io_md": false, 00:12:37.635 "write_zeroes": true, 00:12:37.635 "zcopy": false, 00:12:37.635 "get_zone_info": false, 00:12:37.635 "zone_management": false, 00:12:37.635 "zone_append": false, 00:12:37.635 "compare": false, 00:12:37.635 "compare_and_write": false, 00:12:37.635 "abort": false, 00:12:37.635 "seek_hole": false, 00:12:37.635 "seek_data": false, 00:12:37.635 "copy": false, 00:12:37.635 
"nvme_iov_md": false 00:12:37.635 }, 00:12:37.635 "memory_domains": [ 00:12:37.635 { 00:12:37.635 "dma_device_id": "system", 00:12:37.635 "dma_device_type": 1 00:12:37.635 }, 00:12:37.635 { 00:12:37.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.635 "dma_device_type": 2 00:12:37.635 }, 00:12:37.635 { 00:12:37.635 "dma_device_id": "system", 00:12:37.635 "dma_device_type": 1 00:12:37.635 }, 00:12:37.635 { 00:12:37.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.635 "dma_device_type": 2 00:12:37.635 } 00:12:37.635 ], 00:12:37.635 "driver_specific": { 00:12:37.635 "raid": { 00:12:37.635 "uuid": "987820c8-4819-448e-b1cd-511bb2f438db", 00:12:37.635 "strip_size_kb": 64, 00:12:37.635 "state": "online", 00:12:37.635 "raid_level": "concat", 00:12:37.635 "superblock": false, 00:12:37.635 "num_base_bdevs": 2, 00:12:37.635 "num_base_bdevs_discovered": 2, 00:12:37.635 "num_base_bdevs_operational": 2, 00:12:37.635 "base_bdevs_list": [ 00:12:37.635 { 00:12:37.635 "name": "BaseBdev1", 00:12:37.635 "uuid": "9b4fd7fe-7b89-422d-a518-1e7b0777e87d", 00:12:37.635 "is_configured": true, 00:12:37.635 "data_offset": 0, 00:12:37.635 "data_size": 65536 00:12:37.635 }, 00:12:37.635 { 00:12:37.635 "name": "BaseBdev2", 00:12:37.635 "uuid": "9af7ecac-8242-4ea4-9c17-841fbf6cc7ef", 00:12:37.635 "is_configured": true, 00:12:37.635 "data_offset": 0, 00:12:37.635 "data_size": 65536 00:12:37.635 } 00:12:37.635 ] 00:12:37.635 } 00:12:37.635 } 00:12:37.635 }' 00:12:37.635 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:37.893 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:37.894 BaseBdev2' 00:12:37.894 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:37.894 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:37.894 22:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:37.894 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:37.894 "name": "BaseBdev1", 00:12:37.894 "aliases": [ 00:12:37.894 "9b4fd7fe-7b89-422d-a518-1e7b0777e87d" 00:12:37.894 ], 00:12:37.894 "product_name": "Malloc disk", 00:12:37.894 "block_size": 512, 00:12:37.894 "num_blocks": 65536, 00:12:37.894 "uuid": "9b4fd7fe-7b89-422d-a518-1e7b0777e87d", 00:12:37.894 "assigned_rate_limits": { 00:12:37.894 "rw_ios_per_sec": 0, 00:12:37.894 "rw_mbytes_per_sec": 0, 00:12:37.894 "r_mbytes_per_sec": 0, 00:12:37.894 "w_mbytes_per_sec": 0 00:12:37.894 }, 00:12:37.894 "claimed": true, 00:12:37.894 "claim_type": "exclusive_write", 00:12:37.894 "zoned": false, 00:12:37.894 "supported_io_types": { 00:12:37.894 "read": true, 00:12:37.894 "write": true, 00:12:37.894 "unmap": true, 00:12:37.894 "flush": true, 00:12:37.894 "reset": true, 00:12:37.894 "nvme_admin": false, 00:12:37.894 "nvme_io": false, 00:12:37.894 "nvme_io_md": false, 00:12:37.894 "write_zeroes": true, 00:12:37.894 "zcopy": true, 00:12:37.894 "get_zone_info": false, 00:12:37.894 "zone_management": false, 00:12:37.894 "zone_append": false, 00:12:37.894 "compare": false, 00:12:37.894 "compare_and_write": false, 00:12:37.894 "abort": true, 00:12:37.894 "seek_hole": false, 00:12:37.894 "seek_data": false, 00:12:37.894 "copy": true, 00:12:37.894 
"nvme_iov_md": false 00:12:37.894 }, 00:12:37.894 "memory_domains": [ 00:12:37.894 { 00:12:37.894 "dma_device_id": "system", 00:12:37.894 "dma_device_type": 1 00:12:37.894 }, 00:12:37.894 { 00:12:37.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.894 "dma_device_type": 2 00:12:37.894 } 00:12:37.894 ], 00:12:37.894 "driver_specific": {} 00:12:37.894 }' 00:12:37.894 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.894 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.152 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:38.152 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.152 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.152 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:38.152 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.152 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.152 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:38.152 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.152 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.411 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:38.411 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:38.411 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:38.411 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:38.669 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:38.669 "name": "BaseBdev2", 00:12:38.669 "aliases": [ 00:12:38.669 "9af7ecac-8242-4ea4-9c17-841fbf6cc7ef" 00:12:38.669 ], 00:12:38.669 "product_name": "Malloc disk", 00:12:38.669 "block_size": 512, 00:12:38.669 "num_blocks": 65536, 00:12:38.669 "uuid": "9af7ecac-8242-4ea4-9c17-841fbf6cc7ef", 00:12:38.669 "assigned_rate_limits": { 00:12:38.669 "rw_ios_per_sec": 0, 00:12:38.669 "rw_mbytes_per_sec": 0, 00:12:38.669 "r_mbytes_per_sec": 0, 00:12:38.669 "w_mbytes_per_sec": 0 00:12:38.669 }, 00:12:38.669 "claimed": true, 00:12:38.669 "claim_type": "exclusive_write", 00:12:38.669 "zoned": false, 00:12:38.669 "supported_io_types": { 00:12:38.669 "read": true, 00:12:38.669 "write": true, 00:12:38.669 "unmap": true, 00:12:38.669 "flush": true, 00:12:38.669 "reset": true, 00:12:38.669 "nvme_admin": false, 00:12:38.669 "nvme_io": false, 00:12:38.669 "nvme_io_md": false, 00:12:38.669 "write_zeroes": true, 00:12:38.669 "zcopy": true, 00:12:38.669 "get_zone_info": false, 00:12:38.669 "zone_management": false, 00:12:38.669 "zone_append": false, 00:12:38.669 "compare": false, 00:12:38.669 "compare_and_write": false, 00:12:38.669 "abort": true, 00:12:38.669 "seek_hole": false, 00:12:38.669 "seek_data": false, 00:12:38.669 "copy": true, 00:12:38.669 "nvme_iov_md": false 00:12:38.669 }, 00:12:38.669 "memory_domains": [ 00:12:38.669 { 00:12:38.669 "dma_device_id": "system", 00:12:38.669 "dma_device_type": 1 00:12:38.669 }, 
00:12:38.669 { 00:12:38.669 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.669 "dma_device_type": 2 00:12:38.669 } 00:12:38.669 ], 00:12:38.669 "driver_specific": {} 00:12:38.669 }' 00:12:38.669 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.669 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.669 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:38.669 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.669 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.669 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:38.669 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.669 22:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.928 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:38.928 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.928 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.928 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:38.928 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:39.186 [2024-07-12 22:19:49.319947] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:39.186 [2024-07-12 22:19:49.319980] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:39.186 [2024-07-12 22:19:49.320023] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.186 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:39.445 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:39.445 "name": "Existed_Raid", 00:12:39.445 "uuid": "987820c8-4819-448e-b1cd-511bb2f438db", 00:12:39.445 "strip_size_kb": 64, 00:12:39.445 "state": "offline", 00:12:39.445 "raid_level": "concat", 00:12:39.445 "superblock": false, 00:12:39.445 "num_base_bdevs": 2, 00:12:39.445 "num_base_bdevs_discovered": 1, 00:12:39.445 "num_base_bdevs_operational": 1, 00:12:39.445 "base_bdevs_list": [ 00:12:39.445 { 00:12:39.445 "name": null, 00:12:39.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.445 "is_configured": false, 00:12:39.445 "data_offset": 0, 00:12:39.445 "data_size": 65536 00:12:39.445 }, 00:12:39.445 { 00:12:39.445 "name": "BaseBdev2", 00:12:39.445 "uuid": "9af7ecac-8242-4ea4-9c17-841fbf6cc7ef", 00:12:39.445 "is_configured": true, 00:12:39.445 "data_offset": 0, 00:12:39.445 "data_size": 65536 00:12:39.445 } 00:12:39.445 ] 00:12:39.445 }' 00:12:39.445 22:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:39.445 22:19:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:40.013 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:40.013 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:40.013 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.013 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:40.271 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:40.271 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:40.271 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:40.530 [2024-07-12 22:19:50.676569] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:40.530 [2024-07-12 22:19:50.676626] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24b0000 name Existed_Raid, state offline 00:12:40.530 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:40.530 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:40.530 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.530 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:40.788 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:40.788 22:19:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:40.788 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:40.788 22:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3427423 00:12:40.788 22:19:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3427423 ']' 00:12:40.788 22:19:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3427423 00:12:40.788 22:19:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:40.788 22:19:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:40.788 22:19:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3427423 00:12:40.789 22:19:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:40.789 22:19:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:40.789 22:19:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3427423' 00:12:40.789 killing process with pid 3427423 00:12:40.789 22:19:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3427423 00:12:40.789 [2024-07-12 22:19:50.996515] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:40.789 22:19:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3427423 00:12:40.789 [2024-07-12 22:19:50.997515] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:41.048 00:12:41.048 real 0m10.561s 00:12:41.048 user 0m18.774s 00:12:41.048 sys 0m1.948s 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.048 ************************************ 00:12:41.048 END TEST raid_state_function_test 00:12:41.048 ************************************ 00:12:41.048 22:19:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:41.048 22:19:51 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:12:41.048 22:19:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:41.048 22:19:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:41.048 22:19:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:41.048 ************************************ 00:12:41.048 START TEST raid_state_function_test_sb 00:12:41.048 ************************************ 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:41.048 22:19:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3428982 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3428982' 00:12:41.048 Process raid pid: 3428982 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3428982 /var/tmp/spdk-raid.sock 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3428982 ']' 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:41.048 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
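Note: the raid_state_function_test_sb run above drives a standalone bdev_svc app over a dedicated RPC socket (-r /var/tmp/spdk-raid.sock, with bdev_raid debug logging enabled via -L bdev_raid), then waits for it with the autotest_common.sh waitforlisten helper. A minimal hand-run sketch of that setup step, assuming it is executed from the spdk source root; the polling loop is only a rough stand-in for waitforlisten and assumes the generic rpc_get_methods RPC is available:

  # start the bare bdev service on its own RPC socket, as in the trace above
  ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  raid_pid=$!
  # wait until the app answers on the UNIX socket before issuing any bdev RPCs
  until ./scripts/rpc.py -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
      sleep 0.1
  done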
00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:41.048 22:19:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:41.048 [2024-07-12 22:19:51.367396] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:12:41.048 [2024-07-12 22:19:51.367465] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:41.307 [2024-07-12 22:19:51.498944] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:41.307 [2024-07-12 22:19:51.605346] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:41.565 [2024-07-12 22:19:51.673142] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:41.565 [2024-07-12 22:19:51.673181] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:42.164 22:19:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:42.164 22:19:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:42.164 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:42.423 [2024-07-12 22:19:52.512433] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:42.423 [2024-07-12 22:19:52.512479] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:42.423 [2024-07-12 22:19:52.512494] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:42.423 [2024-07-12 22:19:52.512506] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:42.423 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:42.423 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:42.423 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:42.423 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:42.423 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:42.423 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:42.423 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:42.423 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:42.423 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:42.423 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:42.423 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:42.423 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.682 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:42.682 "name": "Existed_Raid", 00:12:42.682 "uuid": "73da67cf-d5c6-44a0-8ac9-10c042b75fbe", 00:12:42.682 "strip_size_kb": 64, 00:12:42.683 "state": "configuring", 00:12:42.683 "raid_level": "concat", 00:12:42.683 "superblock": true, 00:12:42.683 "num_base_bdevs": 2, 00:12:42.683 "num_base_bdevs_discovered": 0, 00:12:42.683 "num_base_bdevs_operational": 2, 00:12:42.683 "base_bdevs_list": [ 00:12:42.683 { 00:12:42.683 "name": "BaseBdev1", 00:12:42.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:42.683 "is_configured": false, 00:12:42.683 "data_offset": 0, 00:12:42.683 "data_size": 0 00:12:42.683 }, 00:12:42.683 { 00:12:42.683 "name": "BaseBdev2", 00:12:42.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:42.683 "is_configured": false, 00:12:42.683 "data_offset": 0, 00:12:42.683 "data_size": 0 00:12:42.683 } 00:12:42.683 ] 00:12:42.683 }' 00:12:42.683 22:19:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:42.683 22:19:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:43.250 22:19:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:43.509 [2024-07-12 22:19:53.579089] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:43.509 [2024-07-12 22:19:53.579121] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1afba80 name Existed_Raid, state configuring 00:12:43.509 22:19:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:43.509 [2024-07-12 22:19:53.823756] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:43.509 [2024-07-12 22:19:53.823785] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:43.509 [2024-07-12 22:19:53.823795] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:43.509 [2024-07-12 22:19:53.823807] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:43.767 22:19:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:43.768 [2024-07-12 22:19:54.078423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:43.768 BaseBdev1 00:12:44.027 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:44.027 22:19:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:44.027 22:19:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:44.027 22:19:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:44.027 22:19:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:44.027 22:19:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
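The sequence above (bdev_raid_create issued while its base bdevs do not exist yet, followed by bdev_malloc_create for each of them) is what moves Existed_Raid from "configuring" to "online". A condensed sketch of that flow, using only the RPCs, socket path, and malloc sizes visible in this trace (rpc.py is spdk/scripts/rpc.py; the jq filter mirrors the one verify_raid_bdev_state runs above):

  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # "configuring": base bdevs not discovered yet
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # "online" once both base bdevs are claimed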
00:12:44.027 22:19:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:44.027 22:19:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:44.286 [ 00:12:44.286 { 00:12:44.286 "name": "BaseBdev1", 00:12:44.286 "aliases": [ 00:12:44.286 "e4ec6c3e-e0f7-4321-ac66-190540cc271a" 00:12:44.286 ], 00:12:44.286 "product_name": "Malloc disk", 00:12:44.286 "block_size": 512, 00:12:44.286 "num_blocks": 65536, 00:12:44.286 "uuid": "e4ec6c3e-e0f7-4321-ac66-190540cc271a", 00:12:44.286 "assigned_rate_limits": { 00:12:44.286 "rw_ios_per_sec": 0, 00:12:44.286 "rw_mbytes_per_sec": 0, 00:12:44.286 "r_mbytes_per_sec": 0, 00:12:44.286 "w_mbytes_per_sec": 0 00:12:44.286 }, 00:12:44.286 "claimed": true, 00:12:44.286 "claim_type": "exclusive_write", 00:12:44.286 "zoned": false, 00:12:44.286 "supported_io_types": { 00:12:44.286 "read": true, 00:12:44.286 "write": true, 00:12:44.286 "unmap": true, 00:12:44.286 "flush": true, 00:12:44.286 "reset": true, 00:12:44.286 "nvme_admin": false, 00:12:44.286 "nvme_io": false, 00:12:44.286 "nvme_io_md": false, 00:12:44.286 "write_zeroes": true, 00:12:44.286 "zcopy": true, 00:12:44.286 "get_zone_info": false, 00:12:44.286 "zone_management": false, 00:12:44.286 "zone_append": false, 00:12:44.286 "compare": false, 00:12:44.286 "compare_and_write": false, 00:12:44.286 "abort": true, 00:12:44.286 "seek_hole": false, 00:12:44.286 "seek_data": false, 00:12:44.286 "copy": true, 00:12:44.286 "nvme_iov_md": false 00:12:44.286 }, 00:12:44.286 "memory_domains": [ 00:12:44.286 { 00:12:44.286 "dma_device_id": "system", 00:12:44.286 "dma_device_type": 1 00:12:44.286 }, 00:12:44.286 { 00:12:44.286 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.286 "dma_device_type": 2 00:12:44.286 } 00:12:44.286 ], 00:12:44.286 "driver_specific": {} 00:12:44.286 } 00:12:44.286 ] 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.286 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:44.545 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:44.545 "name": "Existed_Raid", 00:12:44.545 "uuid": "384c3233-ff73-45b7-9d7a-606d265b6fea", 00:12:44.545 "strip_size_kb": 64, 00:12:44.545 "state": "configuring", 00:12:44.545 "raid_level": "concat", 00:12:44.545 "superblock": true, 00:12:44.545 "num_base_bdevs": 2, 00:12:44.545 "num_base_bdevs_discovered": 1, 00:12:44.545 "num_base_bdevs_operational": 2, 00:12:44.545 "base_bdevs_list": [ 00:12:44.545 { 00:12:44.545 "name": "BaseBdev1", 00:12:44.545 "uuid": "e4ec6c3e-e0f7-4321-ac66-190540cc271a", 00:12:44.545 "is_configured": true, 00:12:44.545 "data_offset": 2048, 00:12:44.545 "data_size": 63488 00:12:44.545 }, 00:12:44.545 { 00:12:44.545 "name": "BaseBdev2", 00:12:44.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:44.545 "is_configured": false, 00:12:44.545 "data_offset": 0, 00:12:44.545 "data_size": 0 00:12:44.545 } 00:12:44.545 ] 00:12:44.545 }' 00:12:44.545 22:19:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:44.545 22:19:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:45.113 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:45.370 [2024-07-12 22:19:55.638583] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:45.371 [2024-07-12 22:19:55.638622] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1afb350 name Existed_Raid, state configuring 00:12:45.371 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:45.629 [2024-07-12 22:19:55.887278] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:45.629 [2024-07-12 22:19:55.888771] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:45.629 [2024-07-12 22:19:55.888814] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- 
# local raid_bdev_info 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.629 22:19:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:45.888 22:19:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:45.888 "name": "Existed_Raid", 00:12:45.888 "uuid": "0c36ba8f-c724-4260-812f-ff222e8442cc", 00:12:45.888 "strip_size_kb": 64, 00:12:45.888 "state": "configuring", 00:12:45.888 "raid_level": "concat", 00:12:45.888 "superblock": true, 00:12:45.888 "num_base_bdevs": 2, 00:12:45.888 "num_base_bdevs_discovered": 1, 00:12:45.888 "num_base_bdevs_operational": 2, 00:12:45.888 "base_bdevs_list": [ 00:12:45.888 { 00:12:45.888 "name": "BaseBdev1", 00:12:45.888 "uuid": "e4ec6c3e-e0f7-4321-ac66-190540cc271a", 00:12:45.888 "is_configured": true, 00:12:45.888 "data_offset": 2048, 00:12:45.888 "data_size": 63488 00:12:45.888 }, 00:12:45.888 { 00:12:45.888 "name": "BaseBdev2", 00:12:45.888 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:45.888 "is_configured": false, 00:12:45.888 "data_offset": 0, 00:12:45.888 "data_size": 0 00:12:45.888 } 00:12:45.888 ] 00:12:45.888 }' 00:12:45.888 22:19:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:45.888 22:19:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:46.507 22:19:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:46.766 [2024-07-12 22:19:56.977469] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:46.766 [2024-07-12 22:19:56.977617] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1afc000 00:12:46.766 [2024-07-12 22:19:56.977631] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:46.766 [2024-07-12 22:19:56.977802] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a160c0 00:12:46.766 [2024-07-12 22:19:56.977916] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1afc000 00:12:46.766 [2024-07-12 22:19:56.977940] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1afc000 00:12:46.766 [2024-07-12 22:19:56.978036] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:46.766 BaseBdev2 00:12:46.766 22:19:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:46.766 22:19:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:46.766 22:19:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:46.766 22:19:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:46.766 22:19:56 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:46.766 22:19:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:46.766 22:19:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:47.026 22:19:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:47.284 [ 00:12:47.284 { 00:12:47.284 "name": "BaseBdev2", 00:12:47.284 "aliases": [ 00:12:47.284 "95d517c4-b62c-4463-bb17-cb9dca5a5df1" 00:12:47.284 ], 00:12:47.284 "product_name": "Malloc disk", 00:12:47.284 "block_size": 512, 00:12:47.284 "num_blocks": 65536, 00:12:47.284 "uuid": "95d517c4-b62c-4463-bb17-cb9dca5a5df1", 00:12:47.284 "assigned_rate_limits": { 00:12:47.284 "rw_ios_per_sec": 0, 00:12:47.284 "rw_mbytes_per_sec": 0, 00:12:47.284 "r_mbytes_per_sec": 0, 00:12:47.284 "w_mbytes_per_sec": 0 00:12:47.284 }, 00:12:47.284 "claimed": true, 00:12:47.284 "claim_type": "exclusive_write", 00:12:47.284 "zoned": false, 00:12:47.284 "supported_io_types": { 00:12:47.284 "read": true, 00:12:47.284 "write": true, 00:12:47.284 "unmap": true, 00:12:47.284 "flush": true, 00:12:47.284 "reset": true, 00:12:47.284 "nvme_admin": false, 00:12:47.284 "nvme_io": false, 00:12:47.284 "nvme_io_md": false, 00:12:47.284 "write_zeroes": true, 00:12:47.284 "zcopy": true, 00:12:47.284 "get_zone_info": false, 00:12:47.284 "zone_management": false, 00:12:47.284 "zone_append": false, 00:12:47.284 "compare": false, 00:12:47.284 "compare_and_write": false, 00:12:47.284 "abort": true, 00:12:47.284 "seek_hole": false, 00:12:47.284 "seek_data": false, 00:12:47.284 "copy": true, 00:12:47.284 "nvme_iov_md": false 00:12:47.284 }, 00:12:47.284 "memory_domains": [ 00:12:47.284 { 00:12:47.284 "dma_device_id": "system", 00:12:47.284 "dma_device_type": 1 00:12:47.284 }, 00:12:47.284 { 00:12:47.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.284 "dma_device_type": 2 00:12:47.284 } 00:12:47.284 ], 00:12:47.284 "driver_specific": {} 00:12:47.284 } 00:12:47.284 ] 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:47.285 
22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.285 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:47.543 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:47.543 "name": "Existed_Raid", 00:12:47.543 "uuid": "0c36ba8f-c724-4260-812f-ff222e8442cc", 00:12:47.543 "strip_size_kb": 64, 00:12:47.543 "state": "online", 00:12:47.543 "raid_level": "concat", 00:12:47.543 "superblock": true, 00:12:47.543 "num_base_bdevs": 2, 00:12:47.543 "num_base_bdevs_discovered": 2, 00:12:47.543 "num_base_bdevs_operational": 2, 00:12:47.543 "base_bdevs_list": [ 00:12:47.543 { 00:12:47.543 "name": "BaseBdev1", 00:12:47.543 "uuid": "e4ec6c3e-e0f7-4321-ac66-190540cc271a", 00:12:47.543 "is_configured": true, 00:12:47.543 "data_offset": 2048, 00:12:47.543 "data_size": 63488 00:12:47.543 }, 00:12:47.543 { 00:12:47.543 "name": "BaseBdev2", 00:12:47.543 "uuid": "95d517c4-b62c-4463-bb17-cb9dca5a5df1", 00:12:47.543 "is_configured": true, 00:12:47.543 "data_offset": 2048, 00:12:47.543 "data_size": 63488 00:12:47.543 } 00:12:47.543 ] 00:12:47.543 }' 00:12:47.543 22:19:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:47.543 22:19:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:48.111 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:48.111 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:48.111 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:48.111 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:48.111 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:48.111 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:48.111 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:48.111 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:48.370 [2024-07-12 22:19:58.541938] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:48.370 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:48.370 "name": "Existed_Raid", 00:12:48.370 "aliases": [ 00:12:48.370 "0c36ba8f-c724-4260-812f-ff222e8442cc" 00:12:48.370 ], 00:12:48.370 "product_name": "Raid Volume", 00:12:48.370 "block_size": 512, 00:12:48.370 "num_blocks": 126976, 00:12:48.370 "uuid": "0c36ba8f-c724-4260-812f-ff222e8442cc", 00:12:48.370 "assigned_rate_limits": { 00:12:48.370 "rw_ios_per_sec": 0, 00:12:48.370 "rw_mbytes_per_sec": 0, 00:12:48.370 "r_mbytes_per_sec": 0, 00:12:48.370 "w_mbytes_per_sec": 0 00:12:48.370 }, 00:12:48.370 "claimed": false, 00:12:48.370 "zoned": false, 00:12:48.370 
"supported_io_types": { 00:12:48.370 "read": true, 00:12:48.370 "write": true, 00:12:48.370 "unmap": true, 00:12:48.370 "flush": true, 00:12:48.370 "reset": true, 00:12:48.370 "nvme_admin": false, 00:12:48.370 "nvme_io": false, 00:12:48.370 "nvme_io_md": false, 00:12:48.370 "write_zeroes": true, 00:12:48.370 "zcopy": false, 00:12:48.370 "get_zone_info": false, 00:12:48.370 "zone_management": false, 00:12:48.370 "zone_append": false, 00:12:48.370 "compare": false, 00:12:48.370 "compare_and_write": false, 00:12:48.370 "abort": false, 00:12:48.371 "seek_hole": false, 00:12:48.371 "seek_data": false, 00:12:48.371 "copy": false, 00:12:48.371 "nvme_iov_md": false 00:12:48.371 }, 00:12:48.371 "memory_domains": [ 00:12:48.371 { 00:12:48.371 "dma_device_id": "system", 00:12:48.371 "dma_device_type": 1 00:12:48.371 }, 00:12:48.371 { 00:12:48.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:48.371 "dma_device_type": 2 00:12:48.371 }, 00:12:48.371 { 00:12:48.371 "dma_device_id": "system", 00:12:48.371 "dma_device_type": 1 00:12:48.371 }, 00:12:48.371 { 00:12:48.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:48.371 "dma_device_type": 2 00:12:48.371 } 00:12:48.371 ], 00:12:48.371 "driver_specific": { 00:12:48.371 "raid": { 00:12:48.371 "uuid": "0c36ba8f-c724-4260-812f-ff222e8442cc", 00:12:48.371 "strip_size_kb": 64, 00:12:48.371 "state": "online", 00:12:48.371 "raid_level": "concat", 00:12:48.371 "superblock": true, 00:12:48.371 "num_base_bdevs": 2, 00:12:48.371 "num_base_bdevs_discovered": 2, 00:12:48.371 "num_base_bdevs_operational": 2, 00:12:48.371 "base_bdevs_list": [ 00:12:48.371 { 00:12:48.371 "name": "BaseBdev1", 00:12:48.371 "uuid": "e4ec6c3e-e0f7-4321-ac66-190540cc271a", 00:12:48.371 "is_configured": true, 00:12:48.371 "data_offset": 2048, 00:12:48.371 "data_size": 63488 00:12:48.371 }, 00:12:48.371 { 00:12:48.371 "name": "BaseBdev2", 00:12:48.371 "uuid": "95d517c4-b62c-4463-bb17-cb9dca5a5df1", 00:12:48.371 "is_configured": true, 00:12:48.371 "data_offset": 2048, 00:12:48.371 "data_size": 63488 00:12:48.371 } 00:12:48.371 ] 00:12:48.371 } 00:12:48.371 } 00:12:48.371 }' 00:12:48.371 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:48.371 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:48.371 BaseBdev2' 00:12:48.371 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:48.371 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:48.371 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:48.630 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:48.630 "name": "BaseBdev1", 00:12:48.630 "aliases": [ 00:12:48.630 "e4ec6c3e-e0f7-4321-ac66-190540cc271a" 00:12:48.630 ], 00:12:48.630 "product_name": "Malloc disk", 00:12:48.630 "block_size": 512, 00:12:48.630 "num_blocks": 65536, 00:12:48.630 "uuid": "e4ec6c3e-e0f7-4321-ac66-190540cc271a", 00:12:48.630 "assigned_rate_limits": { 00:12:48.630 "rw_ios_per_sec": 0, 00:12:48.630 "rw_mbytes_per_sec": 0, 00:12:48.630 "r_mbytes_per_sec": 0, 00:12:48.630 "w_mbytes_per_sec": 0 00:12:48.630 }, 00:12:48.630 "claimed": true, 00:12:48.630 "claim_type": "exclusive_write", 00:12:48.630 "zoned": 
false, 00:12:48.630 "supported_io_types": { 00:12:48.630 "read": true, 00:12:48.630 "write": true, 00:12:48.630 "unmap": true, 00:12:48.630 "flush": true, 00:12:48.630 "reset": true, 00:12:48.630 "nvme_admin": false, 00:12:48.630 "nvme_io": false, 00:12:48.630 "nvme_io_md": false, 00:12:48.630 "write_zeroes": true, 00:12:48.630 "zcopy": true, 00:12:48.630 "get_zone_info": false, 00:12:48.630 "zone_management": false, 00:12:48.630 "zone_append": false, 00:12:48.630 "compare": false, 00:12:48.630 "compare_and_write": false, 00:12:48.630 "abort": true, 00:12:48.630 "seek_hole": false, 00:12:48.630 "seek_data": false, 00:12:48.630 "copy": true, 00:12:48.630 "nvme_iov_md": false 00:12:48.630 }, 00:12:48.630 "memory_domains": [ 00:12:48.630 { 00:12:48.630 "dma_device_id": "system", 00:12:48.630 "dma_device_type": 1 00:12:48.630 }, 00:12:48.630 { 00:12:48.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:48.630 "dma_device_type": 2 00:12:48.630 } 00:12:48.630 ], 00:12:48.630 "driver_specific": {} 00:12:48.630 }' 00:12:48.630 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.630 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.630 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:48.630 22:19:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.889 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.889 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:48.889 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.889 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.889 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:48.889 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.889 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.889 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:48.889 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:48.889 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:48.889 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:49.149 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:49.149 "name": "BaseBdev2", 00:12:49.149 "aliases": [ 00:12:49.149 "95d517c4-b62c-4463-bb17-cb9dca5a5df1" 00:12:49.149 ], 00:12:49.149 "product_name": "Malloc disk", 00:12:49.149 "block_size": 512, 00:12:49.149 "num_blocks": 65536, 00:12:49.149 "uuid": "95d517c4-b62c-4463-bb17-cb9dca5a5df1", 00:12:49.149 "assigned_rate_limits": { 00:12:49.149 "rw_ios_per_sec": 0, 00:12:49.149 "rw_mbytes_per_sec": 0, 00:12:49.149 "r_mbytes_per_sec": 0, 00:12:49.149 "w_mbytes_per_sec": 0 00:12:49.149 }, 00:12:49.149 "claimed": true, 00:12:49.149 "claim_type": "exclusive_write", 00:12:49.149 "zoned": false, 00:12:49.149 "supported_io_types": { 00:12:49.149 "read": true, 00:12:49.149 "write": true, 00:12:49.149 "unmap": true, 
00:12:49.149 "flush": true, 00:12:49.149 "reset": true, 00:12:49.149 "nvme_admin": false, 00:12:49.149 "nvme_io": false, 00:12:49.149 "nvme_io_md": false, 00:12:49.149 "write_zeroes": true, 00:12:49.149 "zcopy": true, 00:12:49.149 "get_zone_info": false, 00:12:49.149 "zone_management": false, 00:12:49.149 "zone_append": false, 00:12:49.149 "compare": false, 00:12:49.149 "compare_and_write": false, 00:12:49.149 "abort": true, 00:12:49.149 "seek_hole": false, 00:12:49.149 "seek_data": false, 00:12:49.149 "copy": true, 00:12:49.149 "nvme_iov_md": false 00:12:49.149 }, 00:12:49.149 "memory_domains": [ 00:12:49.149 { 00:12:49.149 "dma_device_id": "system", 00:12:49.149 "dma_device_type": 1 00:12:49.149 }, 00:12:49.149 { 00:12:49.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:49.149 "dma_device_type": 2 00:12:49.149 } 00:12:49.149 ], 00:12:49.149 "driver_specific": {} 00:12:49.149 }' 00:12:49.149 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:49.407 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:49.407 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:49.407 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:49.407 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:49.407 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:49.407 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:49.407 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:49.666 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:49.666 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:49.666 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:49.666 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:49.666 22:19:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:49.925 [2024-07-12 22:20:00.065776] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:49.925 [2024-07-12 22:20:00.065804] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:49.925 [2024-07-12 22:20:00.065844] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:49.925 
22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.925 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:50.183 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:50.183 "name": "Existed_Raid", 00:12:50.183 "uuid": "0c36ba8f-c724-4260-812f-ff222e8442cc", 00:12:50.183 "strip_size_kb": 64, 00:12:50.183 "state": "offline", 00:12:50.183 "raid_level": "concat", 00:12:50.183 "superblock": true, 00:12:50.183 "num_base_bdevs": 2, 00:12:50.183 "num_base_bdevs_discovered": 1, 00:12:50.183 "num_base_bdevs_operational": 1, 00:12:50.183 "base_bdevs_list": [ 00:12:50.183 { 00:12:50.183 "name": null, 00:12:50.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:50.183 "is_configured": false, 00:12:50.183 "data_offset": 2048, 00:12:50.183 "data_size": 63488 00:12:50.183 }, 00:12:50.183 { 00:12:50.183 "name": "BaseBdev2", 00:12:50.183 "uuid": "95d517c4-b62c-4463-bb17-cb9dca5a5df1", 00:12:50.183 "is_configured": true, 00:12:50.183 "data_offset": 2048, 00:12:50.183 "data_size": 63488 00:12:50.183 } 00:12:50.183 ] 00:12:50.183 }' 00:12:50.183 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:50.183 22:20:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:50.751 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:50.751 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:50.751 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.751 22:20:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:51.010 22:20:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:51.010 22:20:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:51.010 22:20:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:51.010 [2024-07-12 22:20:01.334212] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:51.010 [2024-07-12 
22:20:01.334271] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1afc000 name Existed_Raid, state offline 00:12:51.268 22:20:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:51.268 22:20:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:51.268 22:20:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.268 22:20:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3428982 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3428982 ']' 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3428982 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3428982 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3428982' 00:12:51.526 killing process with pid 3428982 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3428982 00:12:51.526 [2024-07-12 22:20:01.667427] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:51.526 22:20:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3428982 00:12:51.527 [2024-07-12 22:20:01.668301] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:51.785 22:20:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:51.785 00:12:51.785 real 0m10.570s 00:12:51.785 user 0m18.804s 00:12:51.785 sys 0m1.952s 00:12:51.785 22:20:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:51.785 22:20:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:51.785 ************************************ 00:12:51.785 END TEST raid_state_function_test_sb 00:12:51.785 ************************************ 00:12:51.785 22:20:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:51.785 22:20:01 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:12:51.785 22:20:01 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:51.785 22:20:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:51.785 22:20:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
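For reference, the offline transition exercised at the end of both state-function tests comes from deleting one base bdev of the concat volume: has_redundancy returns 1 for concat above, i.e. the level tolerates no missing member, so the raid drops from "online" to "offline" and is cleaned up once the last base bdev goes away. A minimal sketch with the same RPCs and jq check used in this trace (rpc.py is spdk/scripts/rpc.py as elsewhere in the log):

  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # "offline": concat cannot run degraded
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2   # removing the last base bdev triggers raid_bdev_cleanup for Existed_Raid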
00:12:51.785 ************************************ 00:12:51.785 START TEST raid_superblock_test 00:12:51.785 ************************************ 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3430617 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3430617 /var/tmp/spdk-raid.sock 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3430617 ']' 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:51.785 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:51.785 22:20:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:51.785 [2024-07-12 22:20:02.011902] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:12:51.786 [2024-07-12 22:20:02.011977] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3430617 ] 00:12:52.045 [2024-07-12 22:20:02.142681] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:52.045 [2024-07-12 22:20:02.250631] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:52.045 [2024-07-12 22:20:02.323484] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:52.045 [2024-07-12 22:20:02.323523] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:52.613 22:20:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:52.613 22:20:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:52.613 22:20:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:52.613 22:20:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:52.613 22:20:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:52.613 22:20:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:52.613 22:20:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:52.613 22:20:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:52.613 22:20:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:52.613 22:20:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:52.871 22:20:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:52.871 malloc1 00:12:52.871 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:53.130 [2024-07-12 22:20:03.405776] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:53.130 [2024-07-12 22:20:03.405828] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:53.130 [2024-07-12 22:20:03.405850] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1025570 00:12:53.130 [2024-07-12 22:20:03.405862] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:53.130 [2024-07-12 22:20:03.407655] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:53.130 [2024-07-12 22:20:03.407686] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:53.130 pt1 00:12:53.130 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:53.130 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:53.130 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:53.130 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:53.130 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:53.130 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:53.130 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:53.130 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:53.130 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:53.389 malloc2 00:12:53.389 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:53.648 [2024-07-12 22:20:03.901103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:53.648 [2024-07-12 22:20:03.901153] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:53.648 [2024-07-12 22:20:03.901172] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1026970 00:12:53.648 [2024-07-12 22:20:03.901185] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:53.648 [2024-07-12 22:20:03.902837] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:53.648 [2024-07-12 22:20:03.902865] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:53.648 pt2 00:12:53.648 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:53.648 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:53.648 22:20:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:12:53.908 [2024-07-12 22:20:04.141767] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:53.908 [2024-07-12 22:20:04.143123] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:53.908 [2024-07-12 22:20:04.143274] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11c9270 00:12:53.908 [2024-07-12 22:20:04.143287] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:53.908 [2024-07-12 22:20:04.143488] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11bec10 00:12:53.908 [2024-07-12 22:20:04.143638] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11c9270 00:12:53.908 [2024-07-12 22:20:04.143648] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11c9270 00:12:53.908 [2024-07-12 22:20:04.143753] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:53.908 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:53.908 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:53.908 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:53.908 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:53.908 22:20:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:53.908 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:53.908 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:53.908 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:53.908 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:53.908 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:53.908 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.908 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:54.167 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:54.167 "name": "raid_bdev1", 00:12:54.167 "uuid": "063d9639-7c31-43dc-aaa8-5d223a07d5b4", 00:12:54.167 "strip_size_kb": 64, 00:12:54.167 "state": "online", 00:12:54.167 "raid_level": "concat", 00:12:54.167 "superblock": true, 00:12:54.167 "num_base_bdevs": 2, 00:12:54.167 "num_base_bdevs_discovered": 2, 00:12:54.167 "num_base_bdevs_operational": 2, 00:12:54.167 "base_bdevs_list": [ 00:12:54.167 { 00:12:54.167 "name": "pt1", 00:12:54.167 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:54.167 "is_configured": true, 00:12:54.167 "data_offset": 2048, 00:12:54.167 "data_size": 63488 00:12:54.167 }, 00:12:54.167 { 00:12:54.167 "name": "pt2", 00:12:54.167 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:54.167 "is_configured": true, 00:12:54.167 "data_offset": 2048, 00:12:54.167 "data_size": 63488 00:12:54.167 } 00:12:54.167 ] 00:12:54.167 }' 00:12:54.167 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:54.167 22:20:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.735 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:54.735 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:54.735 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:54.735 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:54.735 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:54.735 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:54.735 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:54.735 22:20:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:54.994 [2024-07-12 22:20:05.192752] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:54.994 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:54.994 "name": "raid_bdev1", 00:12:54.994 "aliases": [ 00:12:54.994 "063d9639-7c31-43dc-aaa8-5d223a07d5b4" 00:12:54.994 ], 00:12:54.994 "product_name": "Raid Volume", 00:12:54.994 "block_size": 512, 00:12:54.994 "num_blocks": 126976, 00:12:54.994 "uuid": 
"063d9639-7c31-43dc-aaa8-5d223a07d5b4", 00:12:54.994 "assigned_rate_limits": { 00:12:54.994 "rw_ios_per_sec": 0, 00:12:54.994 "rw_mbytes_per_sec": 0, 00:12:54.994 "r_mbytes_per_sec": 0, 00:12:54.994 "w_mbytes_per_sec": 0 00:12:54.994 }, 00:12:54.994 "claimed": false, 00:12:54.994 "zoned": false, 00:12:54.994 "supported_io_types": { 00:12:54.994 "read": true, 00:12:54.994 "write": true, 00:12:54.994 "unmap": true, 00:12:54.994 "flush": true, 00:12:54.994 "reset": true, 00:12:54.994 "nvme_admin": false, 00:12:54.994 "nvme_io": false, 00:12:54.994 "nvme_io_md": false, 00:12:54.994 "write_zeroes": true, 00:12:54.994 "zcopy": false, 00:12:54.994 "get_zone_info": false, 00:12:54.994 "zone_management": false, 00:12:54.994 "zone_append": false, 00:12:54.995 "compare": false, 00:12:54.995 "compare_and_write": false, 00:12:54.995 "abort": false, 00:12:54.995 "seek_hole": false, 00:12:54.995 "seek_data": false, 00:12:54.995 "copy": false, 00:12:54.995 "nvme_iov_md": false 00:12:54.995 }, 00:12:54.995 "memory_domains": [ 00:12:54.995 { 00:12:54.995 "dma_device_id": "system", 00:12:54.995 "dma_device_type": 1 00:12:54.995 }, 00:12:54.995 { 00:12:54.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.995 "dma_device_type": 2 00:12:54.995 }, 00:12:54.995 { 00:12:54.995 "dma_device_id": "system", 00:12:54.995 "dma_device_type": 1 00:12:54.995 }, 00:12:54.995 { 00:12:54.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.995 "dma_device_type": 2 00:12:54.995 } 00:12:54.995 ], 00:12:54.995 "driver_specific": { 00:12:54.995 "raid": { 00:12:54.995 "uuid": "063d9639-7c31-43dc-aaa8-5d223a07d5b4", 00:12:54.995 "strip_size_kb": 64, 00:12:54.995 "state": "online", 00:12:54.995 "raid_level": "concat", 00:12:54.995 "superblock": true, 00:12:54.995 "num_base_bdevs": 2, 00:12:54.995 "num_base_bdevs_discovered": 2, 00:12:54.995 "num_base_bdevs_operational": 2, 00:12:54.995 "base_bdevs_list": [ 00:12:54.995 { 00:12:54.995 "name": "pt1", 00:12:54.995 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:54.995 "is_configured": true, 00:12:54.995 "data_offset": 2048, 00:12:54.995 "data_size": 63488 00:12:54.995 }, 00:12:54.995 { 00:12:54.995 "name": "pt2", 00:12:54.995 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:54.995 "is_configured": true, 00:12:54.995 "data_offset": 2048, 00:12:54.995 "data_size": 63488 00:12:54.995 } 00:12:54.995 ] 00:12:54.995 } 00:12:54.995 } 00:12:54.995 }' 00:12:54.995 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:54.995 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:54.995 pt2' 00:12:54.995 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:54.995 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:54.995 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:55.255 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:55.255 "name": "pt1", 00:12:55.255 "aliases": [ 00:12:55.255 "00000000-0000-0000-0000-000000000001" 00:12:55.255 ], 00:12:55.255 "product_name": "passthru", 00:12:55.255 "block_size": 512, 00:12:55.255 "num_blocks": 65536, 00:12:55.255 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:55.255 "assigned_rate_limits": { 00:12:55.255 
"rw_ios_per_sec": 0, 00:12:55.255 "rw_mbytes_per_sec": 0, 00:12:55.255 "r_mbytes_per_sec": 0, 00:12:55.255 "w_mbytes_per_sec": 0 00:12:55.255 }, 00:12:55.255 "claimed": true, 00:12:55.255 "claim_type": "exclusive_write", 00:12:55.255 "zoned": false, 00:12:55.255 "supported_io_types": { 00:12:55.255 "read": true, 00:12:55.255 "write": true, 00:12:55.255 "unmap": true, 00:12:55.255 "flush": true, 00:12:55.255 "reset": true, 00:12:55.255 "nvme_admin": false, 00:12:55.255 "nvme_io": false, 00:12:55.255 "nvme_io_md": false, 00:12:55.255 "write_zeroes": true, 00:12:55.255 "zcopy": true, 00:12:55.255 "get_zone_info": false, 00:12:55.255 "zone_management": false, 00:12:55.255 "zone_append": false, 00:12:55.255 "compare": false, 00:12:55.255 "compare_and_write": false, 00:12:55.255 "abort": true, 00:12:55.255 "seek_hole": false, 00:12:55.255 "seek_data": false, 00:12:55.255 "copy": true, 00:12:55.255 "nvme_iov_md": false 00:12:55.255 }, 00:12:55.255 "memory_domains": [ 00:12:55.255 { 00:12:55.255 "dma_device_id": "system", 00:12:55.255 "dma_device_type": 1 00:12:55.255 }, 00:12:55.255 { 00:12:55.255 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:55.255 "dma_device_type": 2 00:12:55.255 } 00:12:55.255 ], 00:12:55.255 "driver_specific": { 00:12:55.255 "passthru": { 00:12:55.255 "name": "pt1", 00:12:55.255 "base_bdev_name": "malloc1" 00:12:55.255 } 00:12:55.255 } 00:12:55.255 }' 00:12:55.255 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:55.255 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:55.514 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:55.514 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:55.514 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:55.514 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:55.514 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:55.514 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:55.514 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:55.514 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:55.514 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:55.773 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:55.773 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:55.773 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:55.773 22:20:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:55.773 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:55.773 "name": "pt2", 00:12:55.773 "aliases": [ 00:12:55.773 "00000000-0000-0000-0000-000000000002" 00:12:55.773 ], 00:12:55.773 "product_name": "passthru", 00:12:55.773 "block_size": 512, 00:12:55.773 "num_blocks": 65536, 00:12:55.773 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:55.773 "assigned_rate_limits": { 00:12:55.773 "rw_ios_per_sec": 0, 00:12:55.773 "rw_mbytes_per_sec": 0, 00:12:55.773 "r_mbytes_per_sec": 0, 00:12:55.773 "w_mbytes_per_sec": 0 
00:12:55.773 }, 00:12:55.773 "claimed": true, 00:12:55.773 "claim_type": "exclusive_write", 00:12:55.773 "zoned": false, 00:12:55.773 "supported_io_types": { 00:12:55.773 "read": true, 00:12:55.773 "write": true, 00:12:55.773 "unmap": true, 00:12:55.773 "flush": true, 00:12:55.773 "reset": true, 00:12:55.773 "nvme_admin": false, 00:12:55.773 "nvme_io": false, 00:12:55.773 "nvme_io_md": false, 00:12:55.773 "write_zeroes": true, 00:12:55.773 "zcopy": true, 00:12:55.773 "get_zone_info": false, 00:12:55.773 "zone_management": false, 00:12:55.773 "zone_append": false, 00:12:55.773 "compare": false, 00:12:55.773 "compare_and_write": false, 00:12:55.773 "abort": true, 00:12:55.773 "seek_hole": false, 00:12:55.773 "seek_data": false, 00:12:55.773 "copy": true, 00:12:55.773 "nvme_iov_md": false 00:12:55.773 }, 00:12:55.773 "memory_domains": [ 00:12:55.773 { 00:12:55.773 "dma_device_id": "system", 00:12:55.773 "dma_device_type": 1 00:12:55.773 }, 00:12:55.773 { 00:12:55.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:55.773 "dma_device_type": 2 00:12:55.773 } 00:12:55.773 ], 00:12:55.773 "driver_specific": { 00:12:55.773 "passthru": { 00:12:55.773 "name": "pt2", 00:12:55.773 "base_bdev_name": "malloc2" 00:12:55.773 } 00:12:55.773 } 00:12:55.773 }' 00:12:56.031 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:56.031 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:56.031 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:56.031 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:56.031 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:56.031 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:56.031 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:56.031 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:56.290 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:56.290 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:56.290 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:56.290 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:56.290 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:56.290 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:56.550 [2024-07-12 22:20:06.684695] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:56.550 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=063d9639-7c31-43dc-aaa8-5d223a07d5b4 00:12:56.550 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 063d9639-7c31-43dc-aaa8-5d223a07d5b4 ']' 00:12:56.550 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:56.810 [2024-07-12 22:20:06.933104] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:56.810 [2024-07-12 22:20:06.933127] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:12:56.810 [2024-07-12 22:20:06.933181] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:56.810 [2024-07-12 22:20:06.933226] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:56.810 [2024-07-12 22:20:06.933239] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11c9270 name raid_bdev1, state offline 00:12:56.810 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.810 22:20:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:56.810 22:20:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:57.069 22:20:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:57.069 22:20:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:57.069 22:20:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:57.069 22:20:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:57.069 22:20:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:57.328 22:20:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:57.328 22:20:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:57.587 22:20:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:57.845 [2024-07-12 22:20:08.060056] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:57.845 [2024-07-12 22:20:08.061444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:57.845 [2024-07-12 22:20:08.061503] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:57.845 [2024-07-12 22:20:08.061542] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:57.845 [2024-07-12 22:20:08.061561] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:57.845 [2024-07-12 22:20:08.061572] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11c8ff0 name raid_bdev1, state configuring 00:12:57.845 request: 00:12:57.845 { 00:12:57.845 "name": "raid_bdev1", 00:12:57.845 "raid_level": "concat", 00:12:57.845 "base_bdevs": [ 00:12:57.845 "malloc1", 00:12:57.845 "malloc2" 00:12:57.845 ], 00:12:57.845 "strip_size_kb": 64, 00:12:57.845 "superblock": false, 00:12:57.845 "method": "bdev_raid_create", 00:12:57.845 "req_id": 1 00:12:57.845 } 00:12:57.845 Got JSON-RPC error response 00:12:57.845 response: 00:12:57.845 { 00:12:57.845 "code": -17, 00:12:57.846 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:57.846 } 00:12:57.846 22:20:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:57.846 22:20:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:57.846 22:20:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:57.846 22:20:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:57.846 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.846 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:58.104 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:58.104 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:58.104 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:58.363 [2024-07-12 22:20:08.553288] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:58.363 [2024-07-12 22:20:08.553336] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:58.363 [2024-07-12 22:20:08.553357] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10257a0 00:12:58.363 [2024-07-12 22:20:08.553369] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:58.363 [2024-07-12 22:20:08.554961] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:58.363 [2024-07-12 22:20:08.554990] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:58.363 [2024-07-12 22:20:08.555065] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:58.363 [2024-07-12 22:20:08.555091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:58.363 pt1 00:12:58.363 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:12:58.363 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:58.363 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:58.363 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:58.363 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:58.363 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:58.363 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:58.364 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:58.364 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:58.364 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:58.364 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.364 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:58.640 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:58.640 "name": "raid_bdev1", 00:12:58.640 "uuid": "063d9639-7c31-43dc-aaa8-5d223a07d5b4", 00:12:58.640 "strip_size_kb": 64, 00:12:58.640 "state": "configuring", 00:12:58.640 "raid_level": "concat", 00:12:58.640 "superblock": true, 00:12:58.640 "num_base_bdevs": 2, 00:12:58.640 "num_base_bdevs_discovered": 1, 00:12:58.640 "num_base_bdevs_operational": 2, 00:12:58.640 "base_bdevs_list": [ 00:12:58.640 { 00:12:58.640 "name": "pt1", 00:12:58.640 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:58.640 "is_configured": true, 00:12:58.640 "data_offset": 2048, 00:12:58.640 "data_size": 63488 00:12:58.640 }, 00:12:58.640 { 00:12:58.640 "name": null, 00:12:58.640 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:58.640 "is_configured": false, 00:12:58.640 "data_offset": 2048, 00:12:58.640 "data_size": 63488 00:12:58.640 } 00:12:58.640 ] 00:12:58.640 }' 00:12:58.640 22:20:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:58.640 22:20:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.208 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:59.208 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:59.208 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:59.208 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:59.466 [2024-07-12 22:20:09.636163] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:59.467 [2024-07-12 22:20:09.636211] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:59.467 [2024-07-12 22:20:09.636230] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11bf820 00:12:59.467 [2024-07-12 22:20:09.636243] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:59.467 [2024-07-12 22:20:09.636589] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:59.467 [2024-07-12 22:20:09.636608] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:59.467 [2024-07-12 22:20:09.636672] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:59.467 [2024-07-12 22:20:09.636692] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:59.467 [2024-07-12 22:20:09.636786] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x101bec0 00:12:59.467 [2024-07-12 22:20:09.636802] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:59.467 [2024-07-12 22:20:09.636983] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x101cf00 00:12:59.467 [2024-07-12 22:20:09.637111] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x101bec0 00:12:59.467 [2024-07-12 22:20:09.637121] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x101bec0 00:12:59.467 [2024-07-12 22:20:09.637219] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:59.467 pt2 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.467 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:59.726 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:12:59.726 "name": "raid_bdev1", 00:12:59.726 "uuid": "063d9639-7c31-43dc-aaa8-5d223a07d5b4", 00:12:59.726 "strip_size_kb": 64, 00:12:59.726 "state": "online", 00:12:59.726 "raid_level": "concat", 00:12:59.726 "superblock": true, 00:12:59.726 "num_base_bdevs": 2, 00:12:59.726 "num_base_bdevs_discovered": 2, 00:12:59.726 "num_base_bdevs_operational": 2, 00:12:59.726 "base_bdevs_list": [ 00:12:59.726 { 00:12:59.726 "name": "pt1", 00:12:59.726 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:59.726 "is_configured": true, 00:12:59.726 "data_offset": 2048, 00:12:59.726 "data_size": 63488 00:12:59.726 }, 00:12:59.726 { 00:12:59.726 "name": "pt2", 00:12:59.726 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:59.726 "is_configured": true, 00:12:59.726 "data_offset": 2048, 00:12:59.726 "data_size": 63488 00:12:59.726 } 00:12:59.726 ] 00:12:59.726 }' 00:12:59.726 22:20:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.726 22:20:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.296 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:00.296 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:00.296 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:00.296 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:00.296 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:00.296 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:00.296 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:00.296 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:00.555 [2024-07-12 22:20:10.735334] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:00.555 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:00.555 "name": "raid_bdev1", 00:13:00.555 "aliases": [ 00:13:00.555 "063d9639-7c31-43dc-aaa8-5d223a07d5b4" 00:13:00.555 ], 00:13:00.555 "product_name": "Raid Volume", 00:13:00.555 "block_size": 512, 00:13:00.555 "num_blocks": 126976, 00:13:00.555 "uuid": "063d9639-7c31-43dc-aaa8-5d223a07d5b4", 00:13:00.555 "assigned_rate_limits": { 00:13:00.555 "rw_ios_per_sec": 0, 00:13:00.555 "rw_mbytes_per_sec": 0, 00:13:00.555 "r_mbytes_per_sec": 0, 00:13:00.555 "w_mbytes_per_sec": 0 00:13:00.555 }, 00:13:00.555 "claimed": false, 00:13:00.555 "zoned": false, 00:13:00.555 "supported_io_types": { 00:13:00.555 "read": true, 00:13:00.555 "write": true, 00:13:00.555 "unmap": true, 00:13:00.555 "flush": true, 00:13:00.555 "reset": true, 00:13:00.555 "nvme_admin": false, 00:13:00.555 "nvme_io": false, 00:13:00.555 "nvme_io_md": false, 00:13:00.555 "write_zeroes": true, 00:13:00.555 "zcopy": false, 00:13:00.555 "get_zone_info": false, 00:13:00.555 "zone_management": false, 00:13:00.555 "zone_append": false, 00:13:00.555 "compare": false, 00:13:00.555 "compare_and_write": false, 00:13:00.555 "abort": false, 00:13:00.555 "seek_hole": false, 00:13:00.555 "seek_data": false, 00:13:00.555 "copy": false, 00:13:00.555 "nvme_iov_md": false 00:13:00.555 }, 00:13:00.555 "memory_domains": [ 00:13:00.555 { 00:13:00.555 "dma_device_id": 
"system", 00:13:00.555 "dma_device_type": 1 00:13:00.555 }, 00:13:00.555 { 00:13:00.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:00.555 "dma_device_type": 2 00:13:00.555 }, 00:13:00.555 { 00:13:00.555 "dma_device_id": "system", 00:13:00.555 "dma_device_type": 1 00:13:00.555 }, 00:13:00.555 { 00:13:00.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:00.555 "dma_device_type": 2 00:13:00.555 } 00:13:00.555 ], 00:13:00.555 "driver_specific": { 00:13:00.555 "raid": { 00:13:00.555 "uuid": "063d9639-7c31-43dc-aaa8-5d223a07d5b4", 00:13:00.555 "strip_size_kb": 64, 00:13:00.555 "state": "online", 00:13:00.555 "raid_level": "concat", 00:13:00.555 "superblock": true, 00:13:00.555 "num_base_bdevs": 2, 00:13:00.555 "num_base_bdevs_discovered": 2, 00:13:00.555 "num_base_bdevs_operational": 2, 00:13:00.555 "base_bdevs_list": [ 00:13:00.555 { 00:13:00.555 "name": "pt1", 00:13:00.555 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:00.555 "is_configured": true, 00:13:00.555 "data_offset": 2048, 00:13:00.555 "data_size": 63488 00:13:00.555 }, 00:13:00.555 { 00:13:00.555 "name": "pt2", 00:13:00.555 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:00.555 "is_configured": true, 00:13:00.555 "data_offset": 2048, 00:13:00.555 "data_size": 63488 00:13:00.555 } 00:13:00.555 ] 00:13:00.555 } 00:13:00.555 } 00:13:00.555 }' 00:13:00.555 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:00.555 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:00.555 pt2' 00:13:00.555 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:00.555 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:00.555 22:20:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:00.814 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:00.814 "name": "pt1", 00:13:00.814 "aliases": [ 00:13:00.814 "00000000-0000-0000-0000-000000000001" 00:13:00.814 ], 00:13:00.814 "product_name": "passthru", 00:13:00.814 "block_size": 512, 00:13:00.814 "num_blocks": 65536, 00:13:00.814 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:00.814 "assigned_rate_limits": { 00:13:00.814 "rw_ios_per_sec": 0, 00:13:00.814 "rw_mbytes_per_sec": 0, 00:13:00.814 "r_mbytes_per_sec": 0, 00:13:00.814 "w_mbytes_per_sec": 0 00:13:00.814 }, 00:13:00.814 "claimed": true, 00:13:00.814 "claim_type": "exclusive_write", 00:13:00.814 "zoned": false, 00:13:00.814 "supported_io_types": { 00:13:00.814 "read": true, 00:13:00.814 "write": true, 00:13:00.814 "unmap": true, 00:13:00.814 "flush": true, 00:13:00.814 "reset": true, 00:13:00.814 "nvme_admin": false, 00:13:00.814 "nvme_io": false, 00:13:00.814 "nvme_io_md": false, 00:13:00.814 "write_zeroes": true, 00:13:00.814 "zcopy": true, 00:13:00.814 "get_zone_info": false, 00:13:00.814 "zone_management": false, 00:13:00.814 "zone_append": false, 00:13:00.814 "compare": false, 00:13:00.814 "compare_and_write": false, 00:13:00.814 "abort": true, 00:13:00.814 "seek_hole": false, 00:13:00.814 "seek_data": false, 00:13:00.814 "copy": true, 00:13:00.815 "nvme_iov_md": false 00:13:00.815 }, 00:13:00.815 "memory_domains": [ 00:13:00.815 { 00:13:00.815 "dma_device_id": "system", 00:13:00.815 "dma_device_type": 1 00:13:00.815 }, 
00:13:00.815 { 00:13:00.815 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:00.815 "dma_device_type": 2 00:13:00.815 } 00:13:00.815 ], 00:13:00.815 "driver_specific": { 00:13:00.815 "passthru": { 00:13:00.815 "name": "pt1", 00:13:00.815 "base_bdev_name": "malloc1" 00:13:00.815 } 00:13:00.815 } 00:13:00.815 }' 00:13:00.815 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:00.815 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:00.815 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:01.074 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:01.074 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:01.074 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:01.074 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:01.074 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:01.074 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:01.074 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:01.074 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:01.333 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:01.333 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:01.333 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:01.333 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:01.592 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:01.592 "name": "pt2", 00:13:01.592 "aliases": [ 00:13:01.592 "00000000-0000-0000-0000-000000000002" 00:13:01.592 ], 00:13:01.592 "product_name": "passthru", 00:13:01.592 "block_size": 512, 00:13:01.592 "num_blocks": 65536, 00:13:01.592 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:01.592 "assigned_rate_limits": { 00:13:01.592 "rw_ios_per_sec": 0, 00:13:01.592 "rw_mbytes_per_sec": 0, 00:13:01.592 "r_mbytes_per_sec": 0, 00:13:01.592 "w_mbytes_per_sec": 0 00:13:01.592 }, 00:13:01.592 "claimed": true, 00:13:01.592 "claim_type": "exclusive_write", 00:13:01.592 "zoned": false, 00:13:01.592 "supported_io_types": { 00:13:01.592 "read": true, 00:13:01.592 "write": true, 00:13:01.592 "unmap": true, 00:13:01.592 "flush": true, 00:13:01.592 "reset": true, 00:13:01.592 "nvme_admin": false, 00:13:01.592 "nvme_io": false, 00:13:01.592 "nvme_io_md": false, 00:13:01.592 "write_zeroes": true, 00:13:01.592 "zcopy": true, 00:13:01.592 "get_zone_info": false, 00:13:01.592 "zone_management": false, 00:13:01.592 "zone_append": false, 00:13:01.592 "compare": false, 00:13:01.592 "compare_and_write": false, 00:13:01.592 "abort": true, 00:13:01.592 "seek_hole": false, 00:13:01.592 "seek_data": false, 00:13:01.592 "copy": true, 00:13:01.593 "nvme_iov_md": false 00:13:01.593 }, 00:13:01.593 "memory_domains": [ 00:13:01.593 { 00:13:01.593 "dma_device_id": "system", 00:13:01.593 "dma_device_type": 1 00:13:01.593 }, 00:13:01.593 { 00:13:01.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.593 "dma_device_type": 2 00:13:01.593 } 00:13:01.593 ], 
00:13:01.593 "driver_specific": { 00:13:01.593 "passthru": { 00:13:01.593 "name": "pt2", 00:13:01.593 "base_bdev_name": "malloc2" 00:13:01.593 } 00:13:01.593 } 00:13:01.593 }' 00:13:01.593 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:01.593 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:01.593 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:01.593 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:01.593 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:01.593 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:01.593 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:01.593 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:01.851 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:01.851 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:01.851 22:20:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:01.851 22:20:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:01.851 22:20:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:01.851 22:20:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:02.110 [2024-07-12 22:20:12.247347] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:02.110 22:20:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 063d9639-7c31-43dc-aaa8-5d223a07d5b4 '!=' 063d9639-7c31-43dc-aaa8-5d223a07d5b4 ']' 00:13:02.110 22:20:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:13:02.110 22:20:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:02.110 22:20:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:02.110 22:20:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3430617 00:13:02.110 22:20:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3430617 ']' 00:13:02.110 22:20:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3430617 00:13:02.110 22:20:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:02.110 22:20:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:02.110 22:20:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3430617 00:13:02.111 22:20:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:02.111 22:20:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:02.111 22:20:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3430617' 00:13:02.111 killing process with pid 3430617 00:13:02.111 22:20:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3430617 00:13:02.111 [2024-07-12 22:20:12.318818] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:02.111 [2024-07-12 
22:20:12.318875] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:02.111 [2024-07-12 22:20:12.318920] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:02.111 [2024-07-12 22:20:12.318938] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x101bec0 name raid_bdev1, state offline 00:13:02.111 22:20:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3430617 00:13:02.111 [2024-07-12 22:20:12.337803] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:02.370 22:20:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:02.370 00:13:02.370 real 0m10.614s 00:13:02.370 user 0m18.883s 00:13:02.370 sys 0m2.013s 00:13:02.370 22:20:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:02.370 22:20:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.370 ************************************ 00:13:02.370 END TEST raid_superblock_test 00:13:02.370 ************************************ 00:13:02.370 22:20:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:02.370 22:20:12 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:13:02.370 22:20:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:02.370 22:20:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:02.370 22:20:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:02.370 ************************************ 00:13:02.370 START TEST raid_read_error_test 00:13:02.370 ************************************ 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:02.370 22:20:12 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:13:02.370 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:02.371 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:02.371 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:02.371 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.p9eAor9ull 00:13:02.371 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3432223 00:13:02.371 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3432223 /var/tmp/spdk-raid.sock 00:13:02.371 22:20:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:02.371 22:20:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3432223 ']' 00:13:02.371 22:20:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:02.371 22:20:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:02.371 22:20:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:02.371 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:02.371 22:20:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:02.371 22:20:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.630 [2024-07-12 22:20:12.714517] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
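The surrounding trace is the setup phase of raid_read_error_test: bdevperf is started with -z -f on /var/tmp/spdk-raid.sock, each base bdev is built as a malloc -> error -> passthru chain, a concat raid volume with an on-disk superblock is assembled on top, and a read failure is then injected into EE_BaseBdev1_malloc before perform_tests drives I/O. As a minimal sketch of that sequence, restating only commands that appear verbatim in this trace (the relative paths are an assumption and presume the SPDK source root, with bdevperf already listening on the socket as shown above):

    # sketch only; paths relative to the SPDK source tree are assumed, not taken from the log
    RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # malloc -> error -> passthru chain for each base bdev
    $RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc
    $RPC bdev_error_create BaseBdev1_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
    $RPC bdev_malloc_create 32 512 -b BaseBdev2_malloc
    $RPC bdev_error_create BaseBdev2_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
    # concat raid volume with an on-disk superblock (-s) and 64k strip size
    $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
    # inject read errors on the first base bdev, then run the queued bdevperf job
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure
    ./examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
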
00:13:02.630 [2024-07-12 22:20:12.714583] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3432223 ] 00:13:02.630 [2024-07-12 22:20:12.844969] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:02.630 [2024-07-12 22:20:12.947201] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:02.888 [2024-07-12 22:20:13.005793] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:02.889 [2024-07-12 22:20:13.005830] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:03.456 22:20:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:03.456 22:20:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:03.456 22:20:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:03.456 22:20:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:03.715 BaseBdev1_malloc 00:13:03.715 22:20:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:03.973 true 00:13:03.973 22:20:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:04.232 [2024-07-12 22:20:14.358829] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:04.232 [2024-07-12 22:20:14.358873] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:04.232 [2024-07-12 22:20:14.358894] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c20d0 00:13:04.232 [2024-07-12 22:20:14.358912] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:04.232 [2024-07-12 22:20:14.360763] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:04.232 [2024-07-12 22:20:14.360794] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:04.232 BaseBdev1 00:13:04.232 22:20:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:04.232 22:20:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:04.492 BaseBdev2_malloc 00:13:04.492 22:20:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:04.751 true 00:13:04.751 22:20:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:05.010 [2024-07-12 22:20:15.086581] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:05.010 [2024-07-12 22:20:15.086626] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:05.010 [2024-07-12 22:20:15.086647] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c6910 00:13:05.010 [2024-07-12 22:20:15.086660] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:05.010 [2024-07-12 22:20:15.088265] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:05.010 [2024-07-12 22:20:15.088294] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:05.010 BaseBdev2 00:13:05.010 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:05.010 [2024-07-12 22:20:15.331268] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:05.010 [2024-07-12 22:20:15.332632] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:05.010 [2024-07-12 22:20:15.332828] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13c8320 00:13:05.010 [2024-07-12 22:20:15.332843] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:05.010 [2024-07-12 22:20:15.333048] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13c9290 00:13:05.010 [2024-07-12 22:20:15.333198] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13c8320 00:13:05.010 [2024-07-12 22:20:15.333208] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13c8320 00:13:05.010 [2024-07-12 22:20:15.333313] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:05.270 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:05.270 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:05.270 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:05.270 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:05.270 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:05.270 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:05.270 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:05.270 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:05.270 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:05.270 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:05.270 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.270 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:05.530 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.530 "name": "raid_bdev1", 00:13:05.530 "uuid": "ec6383ca-2a0d-449c-98a6-d91c35955c5b", 00:13:05.530 "strip_size_kb": 64, 00:13:05.530 "state": "online", 00:13:05.530 "raid_level": 
"concat", 00:13:05.530 "superblock": true, 00:13:05.530 "num_base_bdevs": 2, 00:13:05.530 "num_base_bdevs_discovered": 2, 00:13:05.530 "num_base_bdevs_operational": 2, 00:13:05.530 "base_bdevs_list": [ 00:13:05.530 { 00:13:05.530 "name": "BaseBdev1", 00:13:05.530 "uuid": "70b5597b-7c31-5e15-85d2-8f44499facaa", 00:13:05.530 "is_configured": true, 00:13:05.530 "data_offset": 2048, 00:13:05.530 "data_size": 63488 00:13:05.530 }, 00:13:05.530 { 00:13:05.530 "name": "BaseBdev2", 00:13:05.530 "uuid": "5586940e-a125-5de8-bc45-4b61c52dbda4", 00:13:05.530 "is_configured": true, 00:13:05.530 "data_offset": 2048, 00:13:05.530 "data_size": 63488 00:13:05.530 } 00:13:05.530 ] 00:13:05.530 }' 00:13:05.530 22:20:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.530 22:20:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:06.098 22:20:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:06.098 22:20:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:06.098 [2024-07-12 22:20:16.302122] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13c39b0 00:13:07.033 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.291 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:07.550 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:07.550 "name": "raid_bdev1", 00:13:07.550 "uuid": "ec6383ca-2a0d-449c-98a6-d91c35955c5b", 00:13:07.550 "strip_size_kb": 64, 00:13:07.550 "state": "online", 
00:13:07.550 "raid_level": "concat", 00:13:07.550 "superblock": true, 00:13:07.550 "num_base_bdevs": 2, 00:13:07.550 "num_base_bdevs_discovered": 2, 00:13:07.550 "num_base_bdevs_operational": 2, 00:13:07.550 "base_bdevs_list": [ 00:13:07.550 { 00:13:07.550 "name": "BaseBdev1", 00:13:07.550 "uuid": "70b5597b-7c31-5e15-85d2-8f44499facaa", 00:13:07.550 "is_configured": true, 00:13:07.550 "data_offset": 2048, 00:13:07.550 "data_size": 63488 00:13:07.550 }, 00:13:07.550 { 00:13:07.550 "name": "BaseBdev2", 00:13:07.550 "uuid": "5586940e-a125-5de8-bc45-4b61c52dbda4", 00:13:07.550 "is_configured": true, 00:13:07.550 "data_offset": 2048, 00:13:07.550 "data_size": 63488 00:13:07.550 } 00:13:07.550 ] 00:13:07.550 }' 00:13:07.550 22:20:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:07.550 22:20:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:08.116 22:20:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:08.375 [2024-07-12 22:20:18.527228] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:08.375 [2024-07-12 22:20:18.527269] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:08.375 [2024-07-12 22:20:18.530519] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:08.375 [2024-07-12 22:20:18.530550] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:08.375 [2024-07-12 22:20:18.530577] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:08.375 [2024-07-12 22:20:18.530589] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c8320 name raid_bdev1, state offline 00:13:08.375 0 00:13:08.375 22:20:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3432223 00:13:08.375 22:20:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3432223 ']' 00:13:08.375 22:20:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3432223 00:13:08.375 22:20:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:08.375 22:20:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:08.375 22:20:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3432223 00:13:08.375 22:20:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:08.375 22:20:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:08.375 22:20:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3432223' 00:13:08.375 killing process with pid 3432223 00:13:08.375 22:20:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3432223 00:13:08.375 [2024-07-12 22:20:18.594490] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:08.375 22:20:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3432223 00:13:08.375 [2024-07-12 22:20:18.604746] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:08.635 22:20:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.p9eAor9ull 00:13:08.635 22:20:18 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:08.635 22:20:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:08.635 22:20:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:13:08.635 22:20:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:08.635 22:20:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:08.635 22:20:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:08.635 22:20:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:13:08.635 00:13:08.635 real 0m6.180s 00:13:08.635 user 0m9.691s 00:13:08.635 sys 0m1.073s 00:13:08.635 22:20:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:08.635 22:20:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:08.635 ************************************ 00:13:08.635 END TEST raid_read_error_test 00:13:08.635 ************************************ 00:13:08.635 22:20:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:08.635 22:20:18 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:13:08.635 22:20:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:08.635 22:20:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:08.635 22:20:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:08.635 ************************************ 00:13:08.635 START TEST raid_write_error_test 00:13:08.635 ************************************ 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:08.635 22:20:18 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Y82w9UPPgG 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3433151 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3433151 /var/tmp/spdk-raid.sock 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3433151 ']' 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:08.635 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:08.635 22:20:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:08.894 [2024-07-12 22:20:18.991325] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
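For orientation, the write-error pass that follows builds the same bdev stack as the read case above: a malloc bdev wrapped by an error bdev (bdev_error_create exposes it as EE_<base>) and a passthru bdev on top, with a concat raid assembled across the two passthru bdevs and write failures injected into the first base. The following is a condensed, hand-written sketch of that RPC sequence, not the test script itself; names and flags are taken from the log, and it assumes bdevperf is already listening on /var/tmp/spdk-raid.sock.

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  for i in 1 2; do
    $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc            # 32 MiB malloc bdev, 512 B blocks
    $rpc bdev_error_create BaseBdev${i}_malloc                       # exposes EE_BaseBdev${i}_malloc
    $rpc bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
  done
  $rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s   # 64 KiB strips, with superblock
  $rpc bdev_error_inject_error EE_BaseBdev1_malloc write failure     # fail subsequent writes on the first base
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'        # verify state stays online
  $rpc bdev_raid_delete raid_bdev1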
00:13:08.894 [2024-07-12 22:20:18.991398] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3433151 ] 00:13:08.894 [2024-07-12 22:20:19.123199] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:09.152 [2024-07-12 22:20:19.225525] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:09.152 [2024-07-12 22:20:19.282140] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:09.152 [2024-07-12 22:20:19.282169] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:09.719 22:20:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:09.719 22:20:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:09.719 22:20:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:09.719 22:20:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:09.977 BaseBdev1_malloc 00:13:09.977 22:20:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:10.235 true 00:13:10.235 22:20:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:10.493 [2024-07-12 22:20:20.654579] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:10.493 [2024-07-12 22:20:20.654626] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:10.493 [2024-07-12 22:20:20.654649] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x226c0d0 00:13:10.493 [2024-07-12 22:20:20.654662] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:10.493 [2024-07-12 22:20:20.656411] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:10.493 [2024-07-12 22:20:20.656442] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:10.493 BaseBdev1 00:13:10.493 22:20:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:10.493 22:20:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:10.751 BaseBdev2_malloc 00:13:10.751 22:20:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:11.009 true 00:13:11.009 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:11.267 [2024-07-12 22:20:21.385219] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:11.267 [2024-07-12 22:20:21.385263] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:11.267 [2024-07-12 22:20:21.385287] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2270910 00:13:11.267 [2024-07-12 22:20:21.385299] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:11.267 [2024-07-12 22:20:21.386735] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:11.267 [2024-07-12 22:20:21.386763] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:11.267 BaseBdev2 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:11.267 [2024-07-12 22:20:21.553710] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:11.267 [2024-07-12 22:20:21.555058] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:11.267 [2024-07-12 22:20:21.555251] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2272320 00:13:11.267 [2024-07-12 22:20:21.555265] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:11.267 [2024-07-12 22:20:21.555465] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2273290 00:13:11.267 [2024-07-12 22:20:21.555613] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2272320 00:13:11.267 [2024-07-12 22:20:21.555623] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2272320 00:13:11.267 [2024-07-12 22:20:21.555724] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.267 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:11.526 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:11.526 "name": "raid_bdev1", 00:13:11.526 "uuid": "8b16bf74-425c-4347-9598-a9d9daaf1654", 00:13:11.526 "strip_size_kb": 64, 00:13:11.526 "state": "online", 00:13:11.526 
"raid_level": "concat", 00:13:11.526 "superblock": true, 00:13:11.526 "num_base_bdevs": 2, 00:13:11.526 "num_base_bdevs_discovered": 2, 00:13:11.526 "num_base_bdevs_operational": 2, 00:13:11.526 "base_bdevs_list": [ 00:13:11.526 { 00:13:11.526 "name": "BaseBdev1", 00:13:11.526 "uuid": "82ffb24d-58db-5384-bd08-f7791756ce8d", 00:13:11.526 "is_configured": true, 00:13:11.526 "data_offset": 2048, 00:13:11.526 "data_size": 63488 00:13:11.526 }, 00:13:11.526 { 00:13:11.526 "name": "BaseBdev2", 00:13:11.526 "uuid": "905b1c37-2de4-5a40-9082-7fd04570a7af", 00:13:11.526 "is_configured": true, 00:13:11.526 "data_offset": 2048, 00:13:11.526 "data_size": 63488 00:13:11.526 } 00:13:11.526 ] 00:13:11.526 }' 00:13:11.526 22:20:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:11.526 22:20:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:12.460 22:20:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:12.460 22:20:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:12.460 [2024-07-12 22:20:22.528567] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x226d9b0 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.395 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:13.653 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.653 "name": "raid_bdev1", 00:13:13.653 "uuid": "8b16bf74-425c-4347-9598-a9d9daaf1654", 00:13:13.653 "strip_size_kb": 
64, 00:13:13.653 "state": "online", 00:13:13.653 "raid_level": "concat", 00:13:13.653 "superblock": true, 00:13:13.653 "num_base_bdevs": 2, 00:13:13.653 "num_base_bdevs_discovered": 2, 00:13:13.653 "num_base_bdevs_operational": 2, 00:13:13.653 "base_bdevs_list": [ 00:13:13.653 { 00:13:13.653 "name": "BaseBdev1", 00:13:13.653 "uuid": "82ffb24d-58db-5384-bd08-f7791756ce8d", 00:13:13.653 "is_configured": true, 00:13:13.653 "data_offset": 2048, 00:13:13.653 "data_size": 63488 00:13:13.653 }, 00:13:13.653 { 00:13:13.653 "name": "BaseBdev2", 00:13:13.653 "uuid": "905b1c37-2de4-5a40-9082-7fd04570a7af", 00:13:13.653 "is_configured": true, 00:13:13.653 "data_offset": 2048, 00:13:13.653 "data_size": 63488 00:13:13.653 } 00:13:13.653 ] 00:13:13.653 }' 00:13:13.653 22:20:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.653 22:20:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.219 22:20:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:14.478 [2024-07-12 22:20:24.765003] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:14.478 [2024-07-12 22:20:24.765041] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:14.478 [2024-07-12 22:20:24.768214] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:14.478 [2024-07-12 22:20:24.768245] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:14.478 [2024-07-12 22:20:24.768273] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:14.478 [2024-07-12 22:20:24.768284] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2272320 name raid_bdev1, state offline 00:13:14.478 0 00:13:14.478 22:20:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3433151 00:13:14.478 22:20:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3433151 ']' 00:13:14.478 22:20:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3433151 00:13:14.478 22:20:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:14.478 22:20:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:14.478 22:20:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3433151 00:13:14.737 22:20:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:14.737 22:20:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:14.737 22:20:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3433151' 00:13:14.737 killing process with pid 3433151 00:13:14.737 22:20:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3433151 00:13:14.737 [2024-07-12 22:20:24.832720] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:14.737 22:20:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3433151 00:13:14.737 [2024-07-12 22:20:24.843890] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:14.995 22:20:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job 
/raidtest/tmp.Y82w9UPPgG 00:13:14.995 22:20:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:14.995 22:20:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:14.995 22:20:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:13:14.995 22:20:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:14.995 22:20:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:14.995 22:20:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:14.995 22:20:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:13:14.995 00:13:14.995 real 0m6.174s 00:13:14.995 user 0m9.648s 00:13:14.995 sys 0m1.064s 00:13:14.995 22:20:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:14.995 22:20:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.995 ************************************ 00:13:14.995 END TEST raid_write_error_test 00:13:14.995 ************************************ 00:13:14.995 22:20:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:14.995 22:20:25 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:14.995 22:20:25 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:13:14.995 22:20:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:14.995 22:20:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:14.995 22:20:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:14.995 ************************************ 00:13:14.995 START TEST raid_state_function_test 00:13:14.995 ************************************ 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:14.995 22:20:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3434119 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3434119' 00:13:14.995 Process raid pid: 3434119 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3434119 /var/tmp/spdk-raid.sock 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3434119 ']' 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:14.995 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:14.995 22:20:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.995 [2024-07-12 22:20:25.270711] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
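The state-function pass that follows exercises the raid bdev state machine rather than I/O: Existed_Raid is created while its base bdevs are still missing and stays in the "configuring" state, moves to "online" once both BaseBdev1 and BaseBdev2 exist, and, because raid1 is redundant, survives removal of one base bdev. A rough, condensed sketch of that sequence against a running bdev_svc app is shown here; the commands are taken from the log (the intermediate delete/recreate cycles are omitted) and the .state jq filter is an illustrative shorthand rather than the script's exact check.

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  $rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # "configuring": no base bdevs yet

  $rpc bdev_malloc_create 32 512 -b BaseBdev1   # claimed by the raid, 1 of 2 discovered, still configuring
  $rpc bdev_malloc_create 32 512 -b BaseBdev2   # both bases present, raid transitions to online
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # "online"

  $rpc bdev_malloc_delete BaseBdev1             # raid1 has redundancy, Existed_Raid stays online with 1 base
  $rpc bdev_raid_delete Existed_Raid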
00:13:14.995 [2024-07-12 22:20:25.270845] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:15.253 [2024-07-12 22:20:25.467619] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:15.253 [2024-07-12 22:20:25.567848] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:15.579 [2024-07-12 22:20:25.636424] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:15.579 [2024-07-12 22:20:25.636477] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:15.841 22:20:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:15.841 22:20:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:15.841 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:16.099 [2024-07-12 22:20:26.387631] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:16.099 [2024-07-12 22:20:26.387676] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:16.099 [2024-07-12 22:20:26.387688] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:16.099 [2024-07-12 22:20:26.387699] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:16.099 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:16.099 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:16.099 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:16.099 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:16.099 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:16.099 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:16.099 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.099 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.099 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.099 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:16.099 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.099 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:16.357 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.357 "name": "Existed_Raid", 00:13:16.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.357 "strip_size_kb": 0, 00:13:16.357 "state": "configuring", 00:13:16.357 "raid_level": "raid1", 00:13:16.357 "superblock": false, 00:13:16.357 
"num_base_bdevs": 2, 00:13:16.357 "num_base_bdevs_discovered": 0, 00:13:16.357 "num_base_bdevs_operational": 2, 00:13:16.357 "base_bdevs_list": [ 00:13:16.357 { 00:13:16.357 "name": "BaseBdev1", 00:13:16.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.357 "is_configured": false, 00:13:16.357 "data_offset": 0, 00:13:16.357 "data_size": 0 00:13:16.357 }, 00:13:16.357 { 00:13:16.357 "name": "BaseBdev2", 00:13:16.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.357 "is_configured": false, 00:13:16.357 "data_offset": 0, 00:13:16.357 "data_size": 0 00:13:16.357 } 00:13:16.357 ] 00:13:16.357 }' 00:13:16.357 22:20:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.357 22:20:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:16.922 22:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:17.180 [2024-07-12 22:20:27.378138] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:17.180 [2024-07-12 22:20:27.378167] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2454a80 name Existed_Raid, state configuring 00:13:17.180 22:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:17.437 [2024-07-12 22:20:27.622790] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:17.438 [2024-07-12 22:20:27.622821] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:17.438 [2024-07-12 22:20:27.622831] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:17.438 [2024-07-12 22:20:27.622842] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:17.438 22:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:17.696 [2024-07-12 22:20:27.877395] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:17.696 BaseBdev1 00:13:17.696 22:20:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:17.696 22:20:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:17.696 22:20:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:17.696 22:20:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:17.696 22:20:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:17.696 22:20:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:17.696 22:20:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:17.953 [ 
00:13:17.953 { 00:13:17.953 "name": "BaseBdev1", 00:13:17.953 "aliases": [ 00:13:17.953 "fab0e287-3d8a-4063-93bc-c8c78d073314" 00:13:17.953 ], 00:13:17.953 "product_name": "Malloc disk", 00:13:17.953 "block_size": 512, 00:13:17.953 "num_blocks": 65536, 00:13:17.953 "uuid": "fab0e287-3d8a-4063-93bc-c8c78d073314", 00:13:17.953 "assigned_rate_limits": { 00:13:17.953 "rw_ios_per_sec": 0, 00:13:17.953 "rw_mbytes_per_sec": 0, 00:13:17.953 "r_mbytes_per_sec": 0, 00:13:17.953 "w_mbytes_per_sec": 0 00:13:17.953 }, 00:13:17.953 "claimed": true, 00:13:17.953 "claim_type": "exclusive_write", 00:13:17.953 "zoned": false, 00:13:17.953 "supported_io_types": { 00:13:17.953 "read": true, 00:13:17.953 "write": true, 00:13:17.953 "unmap": true, 00:13:17.953 "flush": true, 00:13:17.953 "reset": true, 00:13:17.953 "nvme_admin": false, 00:13:17.953 "nvme_io": false, 00:13:17.953 "nvme_io_md": false, 00:13:17.953 "write_zeroes": true, 00:13:17.953 "zcopy": true, 00:13:17.953 "get_zone_info": false, 00:13:17.953 "zone_management": false, 00:13:17.953 "zone_append": false, 00:13:17.953 "compare": false, 00:13:17.953 "compare_and_write": false, 00:13:17.953 "abort": true, 00:13:17.953 "seek_hole": false, 00:13:17.953 "seek_data": false, 00:13:17.953 "copy": true, 00:13:17.953 "nvme_iov_md": false 00:13:17.953 }, 00:13:17.953 "memory_domains": [ 00:13:17.953 { 00:13:17.953 "dma_device_id": "system", 00:13:17.953 "dma_device_type": 1 00:13:17.953 }, 00:13:17.953 { 00:13:17.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.953 "dma_device_type": 2 00:13:17.953 } 00:13:17.953 ], 00:13:17.953 "driver_specific": {} 00:13:17.953 } 00:13:17.953 ] 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.953 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:18.210 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:18.210 "name": "Existed_Raid", 00:13:18.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:18.210 "strip_size_kb": 0, 00:13:18.210 "state": "configuring", 00:13:18.210 "raid_level": "raid1", 
00:13:18.210 "superblock": false, 00:13:18.210 "num_base_bdevs": 2, 00:13:18.210 "num_base_bdevs_discovered": 1, 00:13:18.210 "num_base_bdevs_operational": 2, 00:13:18.210 "base_bdevs_list": [ 00:13:18.210 { 00:13:18.210 "name": "BaseBdev1", 00:13:18.210 "uuid": "fab0e287-3d8a-4063-93bc-c8c78d073314", 00:13:18.210 "is_configured": true, 00:13:18.210 "data_offset": 0, 00:13:18.210 "data_size": 65536 00:13:18.210 }, 00:13:18.210 { 00:13:18.210 "name": "BaseBdev2", 00:13:18.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:18.210 "is_configured": false, 00:13:18.210 "data_offset": 0, 00:13:18.210 "data_size": 0 00:13:18.210 } 00:13:18.210 ] 00:13:18.210 }' 00:13:18.210 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:18.210 22:20:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:18.825 22:20:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:19.082 [2024-07-12 22:20:29.220972] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:19.082 [2024-07-12 22:20:29.221010] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2454350 name Existed_Raid, state configuring 00:13:19.082 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:19.339 [2024-07-12 22:20:29.465635] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:19.339 [2024-07-12 22:20:29.467171] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:19.339 [2024-07-12 22:20:29.467204] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:13:19.339 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:19.597 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:19.597 "name": "Existed_Raid", 00:13:19.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.597 "strip_size_kb": 0, 00:13:19.597 "state": "configuring", 00:13:19.597 "raid_level": "raid1", 00:13:19.597 "superblock": false, 00:13:19.597 "num_base_bdevs": 2, 00:13:19.597 "num_base_bdevs_discovered": 1, 00:13:19.597 "num_base_bdevs_operational": 2, 00:13:19.597 "base_bdevs_list": [ 00:13:19.597 { 00:13:19.597 "name": "BaseBdev1", 00:13:19.597 "uuid": "fab0e287-3d8a-4063-93bc-c8c78d073314", 00:13:19.597 "is_configured": true, 00:13:19.597 "data_offset": 0, 00:13:19.597 "data_size": 65536 00:13:19.597 }, 00:13:19.597 { 00:13:19.597 "name": "BaseBdev2", 00:13:19.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.597 "is_configured": false, 00:13:19.597 "data_offset": 0, 00:13:19.597 "data_size": 0 00:13:19.597 } 00:13:19.597 ] 00:13:19.597 }' 00:13:19.597 22:20:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:19.597 22:20:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:20.163 22:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:20.421 [2024-07-12 22:20:30.547847] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:20.421 [2024-07-12 22:20:30.547884] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2455000 00:13:20.421 [2024-07-12 22:20:30.547893] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:20.421 [2024-07-12 22:20:30.548092] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x236f0c0 00:13:20.421 [2024-07-12 22:20:30.548211] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2455000 00:13:20.421 [2024-07-12 22:20:30.548222] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2455000 00:13:20.421 [2024-07-12 22:20:30.548387] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:20.421 BaseBdev2 00:13:20.421 22:20:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:20.421 22:20:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:20.421 22:20:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:20.421 22:20:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:20.421 22:20:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:20.421 22:20:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:20.422 22:20:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:20.680 22:20:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:20.941 [ 
00:13:20.941 { 00:13:20.941 "name": "BaseBdev2", 00:13:20.941 "aliases": [ 00:13:20.941 "e6fd56f8-2ad5-463f-ad61-40c97151c510" 00:13:20.941 ], 00:13:20.941 "product_name": "Malloc disk", 00:13:20.941 "block_size": 512, 00:13:20.941 "num_blocks": 65536, 00:13:20.941 "uuid": "e6fd56f8-2ad5-463f-ad61-40c97151c510", 00:13:20.941 "assigned_rate_limits": { 00:13:20.941 "rw_ios_per_sec": 0, 00:13:20.941 "rw_mbytes_per_sec": 0, 00:13:20.941 "r_mbytes_per_sec": 0, 00:13:20.941 "w_mbytes_per_sec": 0 00:13:20.941 }, 00:13:20.941 "claimed": true, 00:13:20.941 "claim_type": "exclusive_write", 00:13:20.941 "zoned": false, 00:13:20.941 "supported_io_types": { 00:13:20.941 "read": true, 00:13:20.941 "write": true, 00:13:20.941 "unmap": true, 00:13:20.941 "flush": true, 00:13:20.941 "reset": true, 00:13:20.941 "nvme_admin": false, 00:13:20.941 "nvme_io": false, 00:13:20.941 "nvme_io_md": false, 00:13:20.941 "write_zeroes": true, 00:13:20.941 "zcopy": true, 00:13:20.941 "get_zone_info": false, 00:13:20.941 "zone_management": false, 00:13:20.941 "zone_append": false, 00:13:20.941 "compare": false, 00:13:20.941 "compare_and_write": false, 00:13:20.941 "abort": true, 00:13:20.941 "seek_hole": false, 00:13:20.941 "seek_data": false, 00:13:20.941 "copy": true, 00:13:20.941 "nvme_iov_md": false 00:13:20.941 }, 00:13:20.941 "memory_domains": [ 00:13:20.941 { 00:13:20.941 "dma_device_id": "system", 00:13:20.941 "dma_device_type": 1 00:13:20.941 }, 00:13:20.941 { 00:13:20.941 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.941 "dma_device_type": 2 00:13:20.941 } 00:13:20.941 ], 00:13:20.941 "driver_specific": {} 00:13:20.941 } 00:13:20.941 ] 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.941 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:21.200 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:13:21.200 "name": "Existed_Raid", 00:13:21.200 "uuid": "877b2320-3b85-4e98-9c75-31f5152bf51d", 00:13:21.201 "strip_size_kb": 0, 00:13:21.201 "state": "online", 00:13:21.201 "raid_level": "raid1", 00:13:21.201 "superblock": false, 00:13:21.201 "num_base_bdevs": 2, 00:13:21.201 "num_base_bdevs_discovered": 2, 00:13:21.201 "num_base_bdevs_operational": 2, 00:13:21.201 "base_bdevs_list": [ 00:13:21.201 { 00:13:21.201 "name": "BaseBdev1", 00:13:21.201 "uuid": "fab0e287-3d8a-4063-93bc-c8c78d073314", 00:13:21.201 "is_configured": true, 00:13:21.201 "data_offset": 0, 00:13:21.201 "data_size": 65536 00:13:21.201 }, 00:13:21.201 { 00:13:21.201 "name": "BaseBdev2", 00:13:21.201 "uuid": "e6fd56f8-2ad5-463f-ad61-40c97151c510", 00:13:21.201 "is_configured": true, 00:13:21.201 "data_offset": 0, 00:13:21.201 "data_size": 65536 00:13:21.201 } 00:13:21.201 ] 00:13:21.201 }' 00:13:21.201 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:21.201 22:20:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:21.768 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:21.768 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:21.768 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:21.768 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:21.768 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:21.768 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:21.768 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:21.768 22:20:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:22.026 [2024-07-12 22:20:32.124284] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:22.026 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:22.026 "name": "Existed_Raid", 00:13:22.026 "aliases": [ 00:13:22.026 "877b2320-3b85-4e98-9c75-31f5152bf51d" 00:13:22.026 ], 00:13:22.026 "product_name": "Raid Volume", 00:13:22.026 "block_size": 512, 00:13:22.026 "num_blocks": 65536, 00:13:22.026 "uuid": "877b2320-3b85-4e98-9c75-31f5152bf51d", 00:13:22.026 "assigned_rate_limits": { 00:13:22.026 "rw_ios_per_sec": 0, 00:13:22.026 "rw_mbytes_per_sec": 0, 00:13:22.026 "r_mbytes_per_sec": 0, 00:13:22.026 "w_mbytes_per_sec": 0 00:13:22.026 }, 00:13:22.026 "claimed": false, 00:13:22.026 "zoned": false, 00:13:22.026 "supported_io_types": { 00:13:22.026 "read": true, 00:13:22.026 "write": true, 00:13:22.026 "unmap": false, 00:13:22.026 "flush": false, 00:13:22.026 "reset": true, 00:13:22.026 "nvme_admin": false, 00:13:22.026 "nvme_io": false, 00:13:22.026 "nvme_io_md": false, 00:13:22.026 "write_zeroes": true, 00:13:22.026 "zcopy": false, 00:13:22.026 "get_zone_info": false, 00:13:22.026 "zone_management": false, 00:13:22.026 "zone_append": false, 00:13:22.026 "compare": false, 00:13:22.026 "compare_and_write": false, 00:13:22.026 "abort": false, 00:13:22.026 "seek_hole": false, 00:13:22.026 "seek_data": false, 00:13:22.026 "copy": false, 00:13:22.026 "nvme_iov_md": false 00:13:22.026 }, 00:13:22.026 
"memory_domains": [ 00:13:22.026 { 00:13:22.026 "dma_device_id": "system", 00:13:22.026 "dma_device_type": 1 00:13:22.026 }, 00:13:22.026 { 00:13:22.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.026 "dma_device_type": 2 00:13:22.026 }, 00:13:22.026 { 00:13:22.026 "dma_device_id": "system", 00:13:22.026 "dma_device_type": 1 00:13:22.026 }, 00:13:22.026 { 00:13:22.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.026 "dma_device_type": 2 00:13:22.026 } 00:13:22.026 ], 00:13:22.026 "driver_specific": { 00:13:22.026 "raid": { 00:13:22.026 "uuid": "877b2320-3b85-4e98-9c75-31f5152bf51d", 00:13:22.026 "strip_size_kb": 0, 00:13:22.026 "state": "online", 00:13:22.026 "raid_level": "raid1", 00:13:22.026 "superblock": false, 00:13:22.026 "num_base_bdevs": 2, 00:13:22.026 "num_base_bdevs_discovered": 2, 00:13:22.026 "num_base_bdevs_operational": 2, 00:13:22.026 "base_bdevs_list": [ 00:13:22.026 { 00:13:22.026 "name": "BaseBdev1", 00:13:22.026 "uuid": "fab0e287-3d8a-4063-93bc-c8c78d073314", 00:13:22.026 "is_configured": true, 00:13:22.026 "data_offset": 0, 00:13:22.026 "data_size": 65536 00:13:22.026 }, 00:13:22.026 { 00:13:22.026 "name": "BaseBdev2", 00:13:22.026 "uuid": "e6fd56f8-2ad5-463f-ad61-40c97151c510", 00:13:22.026 "is_configured": true, 00:13:22.026 "data_offset": 0, 00:13:22.026 "data_size": 65536 00:13:22.026 } 00:13:22.026 ] 00:13:22.026 } 00:13:22.026 } 00:13:22.026 }' 00:13:22.026 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:22.026 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:22.026 BaseBdev2' 00:13:22.027 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:22.027 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:22.027 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:22.284 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:22.284 "name": "BaseBdev1", 00:13:22.284 "aliases": [ 00:13:22.284 "fab0e287-3d8a-4063-93bc-c8c78d073314" 00:13:22.284 ], 00:13:22.284 "product_name": "Malloc disk", 00:13:22.284 "block_size": 512, 00:13:22.284 "num_blocks": 65536, 00:13:22.284 "uuid": "fab0e287-3d8a-4063-93bc-c8c78d073314", 00:13:22.284 "assigned_rate_limits": { 00:13:22.284 "rw_ios_per_sec": 0, 00:13:22.284 "rw_mbytes_per_sec": 0, 00:13:22.284 "r_mbytes_per_sec": 0, 00:13:22.284 "w_mbytes_per_sec": 0 00:13:22.284 }, 00:13:22.284 "claimed": true, 00:13:22.284 "claim_type": "exclusive_write", 00:13:22.284 "zoned": false, 00:13:22.284 "supported_io_types": { 00:13:22.284 "read": true, 00:13:22.284 "write": true, 00:13:22.284 "unmap": true, 00:13:22.284 "flush": true, 00:13:22.284 "reset": true, 00:13:22.284 "nvme_admin": false, 00:13:22.284 "nvme_io": false, 00:13:22.284 "nvme_io_md": false, 00:13:22.284 "write_zeroes": true, 00:13:22.284 "zcopy": true, 00:13:22.284 "get_zone_info": false, 00:13:22.284 "zone_management": false, 00:13:22.284 "zone_append": false, 00:13:22.284 "compare": false, 00:13:22.284 "compare_and_write": false, 00:13:22.284 "abort": true, 00:13:22.284 "seek_hole": false, 00:13:22.284 "seek_data": false, 00:13:22.284 "copy": true, 00:13:22.284 "nvme_iov_md": false 00:13:22.284 }, 00:13:22.284 
"memory_domains": [ 00:13:22.284 { 00:13:22.284 "dma_device_id": "system", 00:13:22.284 "dma_device_type": 1 00:13:22.284 }, 00:13:22.284 { 00:13:22.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.284 "dma_device_type": 2 00:13:22.284 } 00:13:22.284 ], 00:13:22.284 "driver_specific": {} 00:13:22.284 }' 00:13:22.284 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:22.284 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:22.284 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:22.284 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:22.284 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:22.542 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:22.542 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:22.542 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:22.542 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:22.542 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:22.542 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:22.542 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:22.542 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:22.542 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:22.542 22:20:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:22.798 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:22.798 "name": "BaseBdev2", 00:13:22.798 "aliases": [ 00:13:22.798 "e6fd56f8-2ad5-463f-ad61-40c97151c510" 00:13:22.798 ], 00:13:22.798 "product_name": "Malloc disk", 00:13:22.798 "block_size": 512, 00:13:22.798 "num_blocks": 65536, 00:13:22.798 "uuid": "e6fd56f8-2ad5-463f-ad61-40c97151c510", 00:13:22.798 "assigned_rate_limits": { 00:13:22.798 "rw_ios_per_sec": 0, 00:13:22.798 "rw_mbytes_per_sec": 0, 00:13:22.798 "r_mbytes_per_sec": 0, 00:13:22.798 "w_mbytes_per_sec": 0 00:13:22.798 }, 00:13:22.798 "claimed": true, 00:13:22.798 "claim_type": "exclusive_write", 00:13:22.798 "zoned": false, 00:13:22.798 "supported_io_types": { 00:13:22.798 "read": true, 00:13:22.798 "write": true, 00:13:22.798 "unmap": true, 00:13:22.798 "flush": true, 00:13:22.798 "reset": true, 00:13:22.798 "nvme_admin": false, 00:13:22.798 "nvme_io": false, 00:13:22.798 "nvme_io_md": false, 00:13:22.798 "write_zeroes": true, 00:13:22.798 "zcopy": true, 00:13:22.798 "get_zone_info": false, 00:13:22.798 "zone_management": false, 00:13:22.798 "zone_append": false, 00:13:22.798 "compare": false, 00:13:22.798 "compare_and_write": false, 00:13:22.798 "abort": true, 00:13:22.798 "seek_hole": false, 00:13:22.798 "seek_data": false, 00:13:22.798 "copy": true, 00:13:22.798 "nvme_iov_md": false 00:13:22.798 }, 00:13:22.798 "memory_domains": [ 00:13:22.798 { 00:13:22.798 "dma_device_id": "system", 00:13:22.798 "dma_device_type": 1 00:13:22.798 }, 00:13:22.798 { 00:13:22.798 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:22.798 "dma_device_type": 2 00:13:22.798 } 00:13:22.798 ], 00:13:22.798 "driver_specific": {} 00:13:22.798 }' 00:13:22.798 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:22.798 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:22.798 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:22.798 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:23.054 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:23.054 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:23.054 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:23.054 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:23.054 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:23.054 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:23.054 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:23.054 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:23.054 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:23.312 [2024-07-12 22:20:33.495710] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:23.312 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:23.313 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:13:23.313 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:23.571 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:23.571 "name": "Existed_Raid", 00:13:23.571 "uuid": "877b2320-3b85-4e98-9c75-31f5152bf51d", 00:13:23.571 "strip_size_kb": 0, 00:13:23.571 "state": "online", 00:13:23.571 "raid_level": "raid1", 00:13:23.571 "superblock": false, 00:13:23.571 "num_base_bdevs": 2, 00:13:23.571 "num_base_bdevs_discovered": 1, 00:13:23.571 "num_base_bdevs_operational": 1, 00:13:23.571 "base_bdevs_list": [ 00:13:23.571 { 00:13:23.571 "name": null, 00:13:23.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:23.571 "is_configured": false, 00:13:23.571 "data_offset": 0, 00:13:23.571 "data_size": 65536 00:13:23.571 }, 00:13:23.571 { 00:13:23.571 "name": "BaseBdev2", 00:13:23.571 "uuid": "e6fd56f8-2ad5-463f-ad61-40c97151c510", 00:13:23.571 "is_configured": true, 00:13:23.571 "data_offset": 0, 00:13:23.571 "data_size": 65536 00:13:23.571 } 00:13:23.571 ] 00:13:23.571 }' 00:13:23.571 22:20:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:23.571 22:20:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:24.137 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:24.137 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:24.137 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.137 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:24.395 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:24.395 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:24.395 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:24.653 [2024-07-12 22:20:34.732060] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:24.653 [2024-07-12 22:20:34.732146] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:24.653 [2024-07-12 22:20:34.744608] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:24.653 [2024-07-12 22:20:34.744644] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:24.653 [2024-07-12 22:20:34.744657] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2455000 name Existed_Raid, state offline 00:13:24.653 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:24.653 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:24.653 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.653 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:24.912 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
raid_bdev= 00:13:24.912 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:24.912 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:24.912 22:20:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3434119 00:13:24.912 22:20:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3434119 ']' 00:13:24.912 22:20:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3434119 00:13:24.912 22:20:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:24.912 22:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:24.912 22:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3434119 00:13:24.912 22:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:24.912 22:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:24.912 22:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3434119' 00:13:24.912 killing process with pid 3434119 00:13:24.912 22:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3434119 00:13:24.912 [2024-07-12 22:20:35.044424] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:24.912 22:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3434119 00:13:24.912 [2024-07-12 22:20:35.045295] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:25.171 00:13:25.171 real 0m10.092s 00:13:25.171 user 0m17.863s 00:13:25.171 sys 0m2.002s 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.171 ************************************ 00:13:25.171 END TEST raid_state_function_test 00:13:25.171 ************************************ 00:13:25.171 22:20:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:25.171 22:20:35 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:13:25.171 22:20:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:25.171 22:20:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:25.171 22:20:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:25.171 ************************************ 00:13:25.171 START TEST raid_state_function_test_sb 00:13:25.171 ************************************ 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3435590 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3435590' 00:13:25.171 Process raid pid: 3435590 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3435590 /var/tmp/spdk-raid.sock 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3435590 ']' 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:25.171 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
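Every stage in this run follows the pattern visible above: a dedicated bdev_svc application is launched with its own UNIX-domain RPC socket and bdev_raid debug logging, and everything that follows is issued through rpc.py against that socket. A minimal stand-alone sketch of the same startup, using only the paths and flags that appear in this trace (the polling loop is an assumption standing in for the harness's waitforlisten helper):

    # launch the bdev service on a private RPC socket with raid debug logging
    /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
        -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    raid_pid=$!
    # poll until the app answers RPCs on that socket before configuring any bdevs
    until /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
        -s /var/tmp/spdk-raid.sock bdev_get_bdevs >/dev/null 2>&1; do
        sleep 0.5
    done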
00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:25.171 22:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:25.171 [2024-07-12 22:20:35.393990] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:13:25.171 [2024-07-12 22:20:35.394057] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:25.430 [2024-07-12 22:20:35.526253] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:25.430 [2024-07-12 22:20:35.633033] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:25.430 [2024-07-12 22:20:35.697987] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:25.430 [2024-07-12 22:20:35.698024] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:26.364 [2024-07-12 22:20:36.537532] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:26.364 [2024-07-12 22:20:36.537574] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:26.364 [2024-07-12 22:20:36.537586] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:26.364 [2024-07-12 22:20:36.537602] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.364 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name 
== "Existed_Raid")' 00:13:26.622 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:26.622 "name": "Existed_Raid", 00:13:26.622 "uuid": "e3d0d49d-db68-45c2-b239-38fb23581870", 00:13:26.622 "strip_size_kb": 0, 00:13:26.622 "state": "configuring", 00:13:26.622 "raid_level": "raid1", 00:13:26.622 "superblock": true, 00:13:26.622 "num_base_bdevs": 2, 00:13:26.622 "num_base_bdevs_discovered": 0, 00:13:26.622 "num_base_bdevs_operational": 2, 00:13:26.622 "base_bdevs_list": [ 00:13:26.622 { 00:13:26.622 "name": "BaseBdev1", 00:13:26.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.622 "is_configured": false, 00:13:26.622 "data_offset": 0, 00:13:26.622 "data_size": 0 00:13:26.622 }, 00:13:26.622 { 00:13:26.622 "name": "BaseBdev2", 00:13:26.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.622 "is_configured": false, 00:13:26.622 "data_offset": 0, 00:13:26.622 "data_size": 0 00:13:26.622 } 00:13:26.622 ] 00:13:26.622 }' 00:13:26.622 22:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.622 22:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:27.185 22:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:27.441 [2024-07-12 22:20:37.560101] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:27.441 [2024-07-12 22:20:37.560136] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a3da80 name Existed_Raid, state configuring 00:13:27.441 22:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:27.441 [2024-07-12 22:20:37.728573] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:27.441 [2024-07-12 22:20:37.728605] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:27.441 [2024-07-12 22:20:37.728615] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:27.441 [2024-07-12 22:20:37.728627] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:27.441 22:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:27.698 [2024-07-12 22:20:37.910979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:27.698 BaseBdev1 00:13:27.698 22:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:27.698 22:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:27.698 22:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:27.698 22:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:27.698 22:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:27.698 22:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:27.698 22:20:37 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:27.955 22:20:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:28.213 [ 00:13:28.213 { 00:13:28.213 "name": "BaseBdev1", 00:13:28.213 "aliases": [ 00:13:28.213 "bb461914-2f82-483c-b173-841df71d500e" 00:13:28.213 ], 00:13:28.213 "product_name": "Malloc disk", 00:13:28.213 "block_size": 512, 00:13:28.213 "num_blocks": 65536, 00:13:28.213 "uuid": "bb461914-2f82-483c-b173-841df71d500e", 00:13:28.213 "assigned_rate_limits": { 00:13:28.213 "rw_ios_per_sec": 0, 00:13:28.213 "rw_mbytes_per_sec": 0, 00:13:28.213 "r_mbytes_per_sec": 0, 00:13:28.213 "w_mbytes_per_sec": 0 00:13:28.213 }, 00:13:28.213 "claimed": true, 00:13:28.213 "claim_type": "exclusive_write", 00:13:28.213 "zoned": false, 00:13:28.213 "supported_io_types": { 00:13:28.213 "read": true, 00:13:28.213 "write": true, 00:13:28.213 "unmap": true, 00:13:28.213 "flush": true, 00:13:28.213 "reset": true, 00:13:28.213 "nvme_admin": false, 00:13:28.213 "nvme_io": false, 00:13:28.213 "nvme_io_md": false, 00:13:28.213 "write_zeroes": true, 00:13:28.213 "zcopy": true, 00:13:28.213 "get_zone_info": false, 00:13:28.213 "zone_management": false, 00:13:28.213 "zone_append": false, 00:13:28.213 "compare": false, 00:13:28.213 "compare_and_write": false, 00:13:28.213 "abort": true, 00:13:28.213 "seek_hole": false, 00:13:28.213 "seek_data": false, 00:13:28.213 "copy": true, 00:13:28.213 "nvme_iov_md": false 00:13:28.213 }, 00:13:28.213 "memory_domains": [ 00:13:28.213 { 00:13:28.213 "dma_device_id": "system", 00:13:28.213 "dma_device_type": 1 00:13:28.213 }, 00:13:28.213 { 00:13:28.213 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:28.213 "dma_device_type": 2 00:13:28.213 } 00:13:28.213 ], 00:13:28.213 "driver_specific": {} 00:13:28.213 } 00:13:28.213 ] 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:13:28.213 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:28.471 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.471 "name": "Existed_Raid", 00:13:28.471 "uuid": "fbe7e2f7-183d-4239-8592-0a1572eb6575", 00:13:28.471 "strip_size_kb": 0, 00:13:28.471 "state": "configuring", 00:13:28.471 "raid_level": "raid1", 00:13:28.471 "superblock": true, 00:13:28.471 "num_base_bdevs": 2, 00:13:28.471 "num_base_bdevs_discovered": 1, 00:13:28.471 "num_base_bdevs_operational": 2, 00:13:28.471 "base_bdevs_list": [ 00:13:28.471 { 00:13:28.471 "name": "BaseBdev1", 00:13:28.471 "uuid": "bb461914-2f82-483c-b173-841df71d500e", 00:13:28.471 "is_configured": true, 00:13:28.471 "data_offset": 2048, 00:13:28.471 "data_size": 63488 00:13:28.471 }, 00:13:28.471 { 00:13:28.471 "name": "BaseBdev2", 00:13:28.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.471 "is_configured": false, 00:13:28.471 "data_offset": 0, 00:13:28.471 "data_size": 0 00:13:28.471 } 00:13:28.471 ] 00:13:28.471 }' 00:13:28.471 22:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.471 22:20:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:29.035 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:29.035 [2024-07-12 22:20:39.350783] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:29.035 [2024-07-12 22:20:39.350821] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a3d350 name Existed_Raid, state configuring 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:29.293 [2024-07-12 22:20:39.535320] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:29.293 [2024-07-12 22:20:39.536802] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:29.293 [2024-07-12 22:20:39.536833] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:29.293 
22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:29.293 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.551 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.551 "name": "Existed_Raid", 00:13:29.551 "uuid": "c6349d83-e577-4015-93f6-0a0b7a8b5ea3", 00:13:29.551 "strip_size_kb": 0, 00:13:29.551 "state": "configuring", 00:13:29.551 "raid_level": "raid1", 00:13:29.551 "superblock": true, 00:13:29.551 "num_base_bdevs": 2, 00:13:29.551 "num_base_bdevs_discovered": 1, 00:13:29.551 "num_base_bdevs_operational": 2, 00:13:29.551 "base_bdevs_list": [ 00:13:29.551 { 00:13:29.551 "name": "BaseBdev1", 00:13:29.551 "uuid": "bb461914-2f82-483c-b173-841df71d500e", 00:13:29.551 "is_configured": true, 00:13:29.551 "data_offset": 2048, 00:13:29.551 "data_size": 63488 00:13:29.551 }, 00:13:29.551 { 00:13:29.551 "name": "BaseBdev2", 00:13:29.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:29.551 "is_configured": false, 00:13:29.551 "data_offset": 0, 00:13:29.551 "data_size": 0 00:13:29.551 } 00:13:29.551 ] 00:13:29.551 }' 00:13:29.552 22:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.552 22:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:30.118 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:30.377 [2024-07-12 22:20:40.549449] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:30.377 [2024-07-12 22:20:40.549602] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a3e000 00:13:30.377 [2024-07-12 22:20:40.549616] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:30.377 [2024-07-12 22:20:40.549788] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19580c0 00:13:30.377 [2024-07-12 22:20:40.549910] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a3e000 00:13:30.377 [2024-07-12 22:20:40.549921] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a3e000 00:13:30.377 [2024-07-12 22:20:40.550040] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:30.377 BaseBdev2 00:13:30.377 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:30.377 22:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:30.377 22:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:30.377 22:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:30.377 22:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 
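At this point the superblock variant has assembled the same two-member raid1 volume as the previous test, only created with -s, so a superblock is written to each member and the data region starts at block offset 2048 (data_size 63488 instead of 65536). The equivalent construction by hand, assuming a bdev_svc is already listening on the same socket (a sketch that simply replays the RPCs shown in the trace):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # two 32 MiB malloc bdevs with 512-byte blocks -> 65536 blocks each
    $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev1
    $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev2
    # raid1 across both members; -s writes the on-disk superblock
    $rpc -s $sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    # confirm the volume is online with both base bdevs configured
    $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'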
00:13:30.377 22:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:30.377 22:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:30.636 [ 00:13:30.636 { 00:13:30.636 "name": "BaseBdev2", 00:13:30.636 "aliases": [ 00:13:30.636 "58e277cb-02ec-4365-989a-dd249b0d43f8" 00:13:30.636 ], 00:13:30.636 "product_name": "Malloc disk", 00:13:30.636 "block_size": 512, 00:13:30.636 "num_blocks": 65536, 00:13:30.636 "uuid": "58e277cb-02ec-4365-989a-dd249b0d43f8", 00:13:30.636 "assigned_rate_limits": { 00:13:30.636 "rw_ios_per_sec": 0, 00:13:30.636 "rw_mbytes_per_sec": 0, 00:13:30.636 "r_mbytes_per_sec": 0, 00:13:30.636 "w_mbytes_per_sec": 0 00:13:30.636 }, 00:13:30.636 "claimed": true, 00:13:30.636 "claim_type": "exclusive_write", 00:13:30.636 "zoned": false, 00:13:30.636 "supported_io_types": { 00:13:30.636 "read": true, 00:13:30.636 "write": true, 00:13:30.636 "unmap": true, 00:13:30.636 "flush": true, 00:13:30.636 "reset": true, 00:13:30.636 "nvme_admin": false, 00:13:30.636 "nvme_io": false, 00:13:30.636 "nvme_io_md": false, 00:13:30.636 "write_zeroes": true, 00:13:30.636 "zcopy": true, 00:13:30.636 "get_zone_info": false, 00:13:30.636 "zone_management": false, 00:13:30.636 "zone_append": false, 00:13:30.636 "compare": false, 00:13:30.636 "compare_and_write": false, 00:13:30.636 "abort": true, 00:13:30.636 "seek_hole": false, 00:13:30.636 "seek_data": false, 00:13:30.636 "copy": true, 00:13:30.636 "nvme_iov_md": false 00:13:30.636 }, 00:13:30.636 "memory_domains": [ 00:13:30.636 { 00:13:30.636 "dma_device_id": "system", 00:13:30.636 "dma_device_type": 1 00:13:30.636 }, 00:13:30.636 { 00:13:30.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.636 "dma_device_type": 2 00:13:30.636 } 00:13:30.636 ], 00:13:30.636 "driver_specific": {} 00:13:30.636 } 00:13:30.636 ] 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.636 22:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:30.981 22:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:30.981 "name": "Existed_Raid", 00:13:30.981 "uuid": "c6349d83-e577-4015-93f6-0a0b7a8b5ea3", 00:13:30.981 "strip_size_kb": 0, 00:13:30.981 "state": "online", 00:13:30.981 "raid_level": "raid1", 00:13:30.981 "superblock": true, 00:13:30.981 "num_base_bdevs": 2, 00:13:30.981 "num_base_bdevs_discovered": 2, 00:13:30.981 "num_base_bdevs_operational": 2, 00:13:30.981 "base_bdevs_list": [ 00:13:30.981 { 00:13:30.981 "name": "BaseBdev1", 00:13:30.981 "uuid": "bb461914-2f82-483c-b173-841df71d500e", 00:13:30.981 "is_configured": true, 00:13:30.981 "data_offset": 2048, 00:13:30.981 "data_size": 63488 00:13:30.981 }, 00:13:30.981 { 00:13:30.981 "name": "BaseBdev2", 00:13:30.981 "uuid": "58e277cb-02ec-4365-989a-dd249b0d43f8", 00:13:30.981 "is_configured": true, 00:13:30.981 "data_offset": 2048, 00:13:30.981 "data_size": 63488 00:13:30.981 } 00:13:30.981 ] 00:13:30.981 }' 00:13:30.981 22:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:30.981 22:20:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:31.550 22:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:31.550 22:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:31.550 22:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:31.550 22:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:31.550 22:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:31.550 22:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:31.550 22:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:31.550 22:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:31.809 [2024-07-12 22:20:41.957449] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:31.809 22:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:31.809 "name": "Existed_Raid", 00:13:31.809 "aliases": [ 00:13:31.809 "c6349d83-e577-4015-93f6-0a0b7a8b5ea3" 00:13:31.809 ], 00:13:31.809 "product_name": "Raid Volume", 00:13:31.809 "block_size": 512, 00:13:31.809 "num_blocks": 63488, 00:13:31.809 "uuid": "c6349d83-e577-4015-93f6-0a0b7a8b5ea3", 00:13:31.809 "assigned_rate_limits": { 00:13:31.809 "rw_ios_per_sec": 0, 00:13:31.809 "rw_mbytes_per_sec": 0, 00:13:31.809 "r_mbytes_per_sec": 0, 00:13:31.809 "w_mbytes_per_sec": 0 00:13:31.809 }, 00:13:31.809 "claimed": false, 00:13:31.809 "zoned": false, 00:13:31.809 "supported_io_types": { 00:13:31.809 "read": true, 
00:13:31.809 "write": true, 00:13:31.809 "unmap": false, 00:13:31.809 "flush": false, 00:13:31.809 "reset": true, 00:13:31.809 "nvme_admin": false, 00:13:31.809 "nvme_io": false, 00:13:31.809 "nvme_io_md": false, 00:13:31.809 "write_zeroes": true, 00:13:31.809 "zcopy": false, 00:13:31.809 "get_zone_info": false, 00:13:31.809 "zone_management": false, 00:13:31.809 "zone_append": false, 00:13:31.809 "compare": false, 00:13:31.809 "compare_and_write": false, 00:13:31.809 "abort": false, 00:13:31.809 "seek_hole": false, 00:13:31.809 "seek_data": false, 00:13:31.809 "copy": false, 00:13:31.809 "nvme_iov_md": false 00:13:31.809 }, 00:13:31.809 "memory_domains": [ 00:13:31.809 { 00:13:31.809 "dma_device_id": "system", 00:13:31.809 "dma_device_type": 1 00:13:31.809 }, 00:13:31.809 { 00:13:31.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.809 "dma_device_type": 2 00:13:31.809 }, 00:13:31.809 { 00:13:31.809 "dma_device_id": "system", 00:13:31.809 "dma_device_type": 1 00:13:31.809 }, 00:13:31.809 { 00:13:31.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.809 "dma_device_type": 2 00:13:31.809 } 00:13:31.809 ], 00:13:31.809 "driver_specific": { 00:13:31.809 "raid": { 00:13:31.809 "uuid": "c6349d83-e577-4015-93f6-0a0b7a8b5ea3", 00:13:31.809 "strip_size_kb": 0, 00:13:31.809 "state": "online", 00:13:31.809 "raid_level": "raid1", 00:13:31.809 "superblock": true, 00:13:31.809 "num_base_bdevs": 2, 00:13:31.809 "num_base_bdevs_discovered": 2, 00:13:31.809 "num_base_bdevs_operational": 2, 00:13:31.809 "base_bdevs_list": [ 00:13:31.809 { 00:13:31.809 "name": "BaseBdev1", 00:13:31.809 "uuid": "bb461914-2f82-483c-b173-841df71d500e", 00:13:31.809 "is_configured": true, 00:13:31.809 "data_offset": 2048, 00:13:31.809 "data_size": 63488 00:13:31.809 }, 00:13:31.809 { 00:13:31.809 "name": "BaseBdev2", 00:13:31.809 "uuid": "58e277cb-02ec-4365-989a-dd249b0d43f8", 00:13:31.809 "is_configured": true, 00:13:31.809 "data_offset": 2048, 00:13:31.809 "data_size": 63488 00:13:31.809 } 00:13:31.809 ] 00:13:31.809 } 00:13:31.809 } 00:13:31.809 }' 00:13:31.809 22:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:31.809 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:31.809 BaseBdev2' 00:13:31.809 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:31.809 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:31.809 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:32.068 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:32.068 "name": "BaseBdev1", 00:13:32.068 "aliases": [ 00:13:32.068 "bb461914-2f82-483c-b173-841df71d500e" 00:13:32.068 ], 00:13:32.068 "product_name": "Malloc disk", 00:13:32.068 "block_size": 512, 00:13:32.068 "num_blocks": 65536, 00:13:32.068 "uuid": "bb461914-2f82-483c-b173-841df71d500e", 00:13:32.068 "assigned_rate_limits": { 00:13:32.068 "rw_ios_per_sec": 0, 00:13:32.068 "rw_mbytes_per_sec": 0, 00:13:32.068 "r_mbytes_per_sec": 0, 00:13:32.068 "w_mbytes_per_sec": 0 00:13:32.068 }, 00:13:32.068 "claimed": true, 00:13:32.068 "claim_type": "exclusive_write", 00:13:32.068 "zoned": false, 00:13:32.068 "supported_io_types": { 
00:13:32.068 "read": true, 00:13:32.068 "write": true, 00:13:32.068 "unmap": true, 00:13:32.068 "flush": true, 00:13:32.068 "reset": true, 00:13:32.068 "nvme_admin": false, 00:13:32.068 "nvme_io": false, 00:13:32.068 "nvme_io_md": false, 00:13:32.068 "write_zeroes": true, 00:13:32.068 "zcopy": true, 00:13:32.068 "get_zone_info": false, 00:13:32.068 "zone_management": false, 00:13:32.068 "zone_append": false, 00:13:32.068 "compare": false, 00:13:32.068 "compare_and_write": false, 00:13:32.068 "abort": true, 00:13:32.068 "seek_hole": false, 00:13:32.068 "seek_data": false, 00:13:32.068 "copy": true, 00:13:32.069 "nvme_iov_md": false 00:13:32.069 }, 00:13:32.069 "memory_domains": [ 00:13:32.069 { 00:13:32.069 "dma_device_id": "system", 00:13:32.069 "dma_device_type": 1 00:13:32.069 }, 00:13:32.069 { 00:13:32.069 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.069 "dma_device_type": 2 00:13:32.069 } 00:13:32.069 ], 00:13:32.069 "driver_specific": {} 00:13:32.069 }' 00:13:32.069 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.069 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.069 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:32.069 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.328 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.328 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:32.328 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.328 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.328 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:32.328 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.328 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.328 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:32.328 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:32.328 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:32.328 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:32.587 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:32.587 "name": "BaseBdev2", 00:13:32.587 "aliases": [ 00:13:32.587 "58e277cb-02ec-4365-989a-dd249b0d43f8" 00:13:32.587 ], 00:13:32.587 "product_name": "Malloc disk", 00:13:32.587 "block_size": 512, 00:13:32.587 "num_blocks": 65536, 00:13:32.587 "uuid": "58e277cb-02ec-4365-989a-dd249b0d43f8", 00:13:32.587 "assigned_rate_limits": { 00:13:32.587 "rw_ios_per_sec": 0, 00:13:32.587 "rw_mbytes_per_sec": 0, 00:13:32.587 "r_mbytes_per_sec": 0, 00:13:32.587 "w_mbytes_per_sec": 0 00:13:32.587 }, 00:13:32.587 "claimed": true, 00:13:32.587 "claim_type": "exclusive_write", 00:13:32.587 "zoned": false, 00:13:32.587 "supported_io_types": { 00:13:32.587 "read": true, 00:13:32.587 "write": true, 00:13:32.587 "unmap": true, 00:13:32.587 "flush": true, 00:13:32.587 "reset": 
true, 00:13:32.587 "nvme_admin": false, 00:13:32.587 "nvme_io": false, 00:13:32.587 "nvme_io_md": false, 00:13:32.587 "write_zeroes": true, 00:13:32.587 "zcopy": true, 00:13:32.587 "get_zone_info": false, 00:13:32.587 "zone_management": false, 00:13:32.587 "zone_append": false, 00:13:32.587 "compare": false, 00:13:32.587 "compare_and_write": false, 00:13:32.587 "abort": true, 00:13:32.587 "seek_hole": false, 00:13:32.587 "seek_data": false, 00:13:32.587 "copy": true, 00:13:32.587 "nvme_iov_md": false 00:13:32.587 }, 00:13:32.587 "memory_domains": [ 00:13:32.587 { 00:13:32.587 "dma_device_id": "system", 00:13:32.587 "dma_device_type": 1 00:13:32.587 }, 00:13:32.587 { 00:13:32.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.587 "dma_device_type": 2 00:13:32.587 } 00:13:32.587 ], 00:13:32.587 "driver_specific": {} 00:13:32.587 }' 00:13:32.587 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.845 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.845 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:32.845 22:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.845 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.845 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:32.845 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.845 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.845 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:32.845 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:33.102 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:33.102 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:33.102 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:33.361 [2024-07-12 22:20:43.449182] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.361 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:33.620 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:33.620 "name": "Existed_Raid", 00:13:33.620 "uuid": "c6349d83-e577-4015-93f6-0a0b7a8b5ea3", 00:13:33.620 "strip_size_kb": 0, 00:13:33.620 "state": "online", 00:13:33.620 "raid_level": "raid1", 00:13:33.620 "superblock": true, 00:13:33.620 "num_base_bdevs": 2, 00:13:33.620 "num_base_bdevs_discovered": 1, 00:13:33.620 "num_base_bdevs_operational": 1, 00:13:33.620 "base_bdevs_list": [ 00:13:33.620 { 00:13:33.620 "name": null, 00:13:33.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:33.620 "is_configured": false, 00:13:33.620 "data_offset": 2048, 00:13:33.620 "data_size": 63488 00:13:33.620 }, 00:13:33.620 { 00:13:33.620 "name": "BaseBdev2", 00:13:33.620 "uuid": "58e277cb-02ec-4365-989a-dd249b0d43f8", 00:13:33.620 "is_configured": true, 00:13:33.620 "data_offset": 2048, 00:13:33.620 "data_size": 63488 00:13:33.620 } 00:13:33.620 ] 00:13:33.620 }' 00:13:33.620 22:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:33.620 22:20:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:34.188 22:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:34.188 22:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:34.188 22:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.188 22:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:34.448 22:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:34.448 22:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:34.448 22:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:34.448 [2024-07-12 22:20:44.761739] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:34.448 [2024-07-12 22:20:44.761817] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:34.448 [2024-07-12 22:20:44.772589] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:34.448 [2024-07-12 22:20:44.772620] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:34.448 [2024-07-12 22:20:44.772632] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a3e000 name Existed_Raid, state offline 00:13:34.707 22:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:34.707 22:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:34.707 22:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.707 22:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3435590 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3435590 ']' 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3435590 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3435590 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3435590' 00:13:34.966 killing process with pid 3435590 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3435590 00:13:34.966 [2024-07-12 22:20:45.083326] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:34.966 22:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3435590 00:13:34.966 [2024-07-12 22:20:45.084209] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:35.226 22:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:35.226 00:13:35.226 real 0m9.972s 00:13:35.226 user 0m17.642s 00:13:35.226 sys 0m1.924s 00:13:35.226 22:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:35.226 22:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:35.226 ************************************ 00:13:35.226 END TEST raid_state_function_test_sb 00:13:35.226 ************************************ 00:13:35.226 22:20:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:35.226 22:20:45 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:13:35.226 22:20:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:35.226 22:20:45 bdev_raid -- common/autotest_common.sh@1105 -- # 
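Between stages the harness tears the previous application down before the next one starts: killprocess checks that the recorded PID is still alive (kill -0), confirms via ps that it is the SPDK reactor rather than a sudo wrapper, signals it, and waits for it to exit, so no raid state carries over from raid_state_function_test_sb into raid_superblock_test. A simplified sketch of that teardown, mirroring the trace above (the sudo branch of the real helper is omitted):

    pid=3435590
    kill -0 "$pid"                              # bail out if the app already exited
    name=$(ps --no-headers -o comm= "$pid")     # reactor_0 for a normal SPDK app
    [ "$name" = sudo ] || kill "$pid"
    wait "$pid"                                 # reap it before launching the next bdev_svc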
xtrace_disable 00:13:35.226 22:20:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:35.226 ************************************ 00:13:35.226 START TEST raid_superblock_test 00:13:35.226 ************************************ 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3437219 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3437219 /var/tmp/spdk-raid.sock 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3437219 ']' 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:35.226 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:35.226 22:20:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.226 [2024-07-12 22:20:45.438360] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
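At this point the test has launched the bdev_svc app with -r /var/tmp/spdk-raid.sock, and everything that follows is driven over that socket with rpc.py. As a condensed, illustrative sketch only (the helper variable and the exact ordering here are assumptions; the individual commands, arguments and UUIDs are the ones that appear verbatim later in this trace), the raid1-with-superblock setup amounts to:

RPC='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'
$RPC bdev_malloc_create 32 512 -b malloc1                        # backing malloc bdevs
$RPC bdev_malloc_create 32 512 -b malloc2
$RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
$RPC bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s     # -s: create with superblock, as raid_superblock_test does
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'   # verify state, level and base bdevs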
00:13:35.226 [2024-07-12 22:20:45.438426] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3437219 ] 00:13:35.485 [2024-07-12 22:20:45.578030] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:35.485 [2024-07-12 22:20:45.711797] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:35.485 [2024-07-12 22:20:45.777781] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:35.485 [2024-07-12 22:20:45.777829] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:36.422 22:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:36.422 22:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:36.422 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:36.422 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:36.422 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:36.422 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:36.422 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:36.422 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:36.422 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:36.422 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:36.422 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:36.422 malloc1 00:13:36.422 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:36.682 [2024-07-12 22:20:46.905744] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:36.682 [2024-07-12 22:20:46.905791] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:36.682 [2024-07-12 22:20:46.905813] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfd4570 00:13:36.682 [2024-07-12 22:20:46.905826] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:36.682 [2024-07-12 22:20:46.907512] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:36.682 [2024-07-12 22:20:46.907544] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:36.682 pt1 00:13:36.682 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:36.682 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:36.682 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:36.682 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:36.682 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:36.682 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:36.682 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:36.682 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:36.682 22:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:36.941 malloc2 00:13:36.941 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:37.200 [2024-07-12 22:20:47.339738] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:37.200 [2024-07-12 22:20:47.339786] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:37.200 [2024-07-12 22:20:47.339804] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfd5970 00:13:37.200 [2024-07-12 22:20:47.339816] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:37.200 [2024-07-12 22:20:47.341489] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:37.200 [2024-07-12 22:20:47.341519] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:37.200 pt2 00:13:37.200 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:37.200 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:37.200 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:13:37.459 [2024-07-12 22:20:47.580513] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:37.460 [2024-07-12 22:20:47.581873] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:37.460 [2024-07-12 22:20:47.582038] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1178270 00:13:37.460 [2024-07-12 22:20:47.582052] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:37.460 [2024-07-12 22:20:47.582259] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfcc0e0 00:13:37.460 [2024-07-12 22:20:47.582410] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1178270 00:13:37.460 [2024-07-12 22:20:47.582420] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1178270 00:13:37.460 [2024-07-12 22:20:47.582527] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:37.460 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:37.460 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:37.460 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:37.460 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:37.460 22:20:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:37.460 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:37.460 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:37.460 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:37.460 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:37.460 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:37.460 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.460 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:37.720 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:37.720 "name": "raid_bdev1", 00:13:37.720 "uuid": "8a1e635a-9143-4d13-a8f3-7ef401c0078b", 00:13:37.720 "strip_size_kb": 0, 00:13:37.720 "state": "online", 00:13:37.720 "raid_level": "raid1", 00:13:37.720 "superblock": true, 00:13:37.720 "num_base_bdevs": 2, 00:13:37.720 "num_base_bdevs_discovered": 2, 00:13:37.720 "num_base_bdevs_operational": 2, 00:13:37.720 "base_bdevs_list": [ 00:13:37.720 { 00:13:37.720 "name": "pt1", 00:13:37.720 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:37.720 "is_configured": true, 00:13:37.720 "data_offset": 2048, 00:13:37.720 "data_size": 63488 00:13:37.720 }, 00:13:37.720 { 00:13:37.720 "name": "pt2", 00:13:37.720 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:37.720 "is_configured": true, 00:13:37.720 "data_offset": 2048, 00:13:37.720 "data_size": 63488 00:13:37.720 } 00:13:37.720 ] 00:13:37.720 }' 00:13:37.720 22:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:37.720 22:20:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.288 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:38.288 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:38.288 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:38.288 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:38.288 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:38.288 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:38.288 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:38.288 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:38.288 [2024-07-12 22:20:48.611483] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:38.547 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:38.547 "name": "raid_bdev1", 00:13:38.547 "aliases": [ 00:13:38.547 "8a1e635a-9143-4d13-a8f3-7ef401c0078b" 00:13:38.547 ], 00:13:38.547 "product_name": "Raid Volume", 00:13:38.547 "block_size": 512, 00:13:38.547 "num_blocks": 63488, 00:13:38.547 "uuid": 
"8a1e635a-9143-4d13-a8f3-7ef401c0078b", 00:13:38.547 "assigned_rate_limits": { 00:13:38.547 "rw_ios_per_sec": 0, 00:13:38.547 "rw_mbytes_per_sec": 0, 00:13:38.547 "r_mbytes_per_sec": 0, 00:13:38.547 "w_mbytes_per_sec": 0 00:13:38.547 }, 00:13:38.547 "claimed": false, 00:13:38.547 "zoned": false, 00:13:38.547 "supported_io_types": { 00:13:38.547 "read": true, 00:13:38.547 "write": true, 00:13:38.547 "unmap": false, 00:13:38.547 "flush": false, 00:13:38.547 "reset": true, 00:13:38.547 "nvme_admin": false, 00:13:38.547 "nvme_io": false, 00:13:38.547 "nvme_io_md": false, 00:13:38.547 "write_zeroes": true, 00:13:38.547 "zcopy": false, 00:13:38.547 "get_zone_info": false, 00:13:38.547 "zone_management": false, 00:13:38.547 "zone_append": false, 00:13:38.547 "compare": false, 00:13:38.547 "compare_and_write": false, 00:13:38.547 "abort": false, 00:13:38.547 "seek_hole": false, 00:13:38.547 "seek_data": false, 00:13:38.547 "copy": false, 00:13:38.547 "nvme_iov_md": false 00:13:38.547 }, 00:13:38.547 "memory_domains": [ 00:13:38.547 { 00:13:38.547 "dma_device_id": "system", 00:13:38.547 "dma_device_type": 1 00:13:38.547 }, 00:13:38.547 { 00:13:38.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.547 "dma_device_type": 2 00:13:38.547 }, 00:13:38.547 { 00:13:38.547 "dma_device_id": "system", 00:13:38.547 "dma_device_type": 1 00:13:38.547 }, 00:13:38.547 { 00:13:38.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.547 "dma_device_type": 2 00:13:38.547 } 00:13:38.547 ], 00:13:38.547 "driver_specific": { 00:13:38.547 "raid": { 00:13:38.547 "uuid": "8a1e635a-9143-4d13-a8f3-7ef401c0078b", 00:13:38.547 "strip_size_kb": 0, 00:13:38.547 "state": "online", 00:13:38.547 "raid_level": "raid1", 00:13:38.547 "superblock": true, 00:13:38.547 "num_base_bdevs": 2, 00:13:38.547 "num_base_bdevs_discovered": 2, 00:13:38.547 "num_base_bdevs_operational": 2, 00:13:38.548 "base_bdevs_list": [ 00:13:38.548 { 00:13:38.548 "name": "pt1", 00:13:38.548 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:38.548 "is_configured": true, 00:13:38.548 "data_offset": 2048, 00:13:38.548 "data_size": 63488 00:13:38.548 }, 00:13:38.548 { 00:13:38.548 "name": "pt2", 00:13:38.548 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:38.548 "is_configured": true, 00:13:38.548 "data_offset": 2048, 00:13:38.548 "data_size": 63488 00:13:38.548 } 00:13:38.548 ] 00:13:38.548 } 00:13:38.548 } 00:13:38.548 }' 00:13:38.548 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:38.548 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:38.548 pt2' 00:13:38.548 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:38.548 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:38.548 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:38.807 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:38.807 "name": "pt1", 00:13:38.807 "aliases": [ 00:13:38.807 "00000000-0000-0000-0000-000000000001" 00:13:38.807 ], 00:13:38.807 "product_name": "passthru", 00:13:38.807 "block_size": 512, 00:13:38.807 "num_blocks": 65536, 00:13:38.807 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:38.807 "assigned_rate_limits": { 00:13:38.807 
"rw_ios_per_sec": 0, 00:13:38.807 "rw_mbytes_per_sec": 0, 00:13:38.807 "r_mbytes_per_sec": 0, 00:13:38.807 "w_mbytes_per_sec": 0 00:13:38.807 }, 00:13:38.807 "claimed": true, 00:13:38.807 "claim_type": "exclusive_write", 00:13:38.807 "zoned": false, 00:13:38.807 "supported_io_types": { 00:13:38.807 "read": true, 00:13:38.807 "write": true, 00:13:38.807 "unmap": true, 00:13:38.807 "flush": true, 00:13:38.807 "reset": true, 00:13:38.807 "nvme_admin": false, 00:13:38.807 "nvme_io": false, 00:13:38.807 "nvme_io_md": false, 00:13:38.807 "write_zeroes": true, 00:13:38.807 "zcopy": true, 00:13:38.807 "get_zone_info": false, 00:13:38.807 "zone_management": false, 00:13:38.807 "zone_append": false, 00:13:38.807 "compare": false, 00:13:38.807 "compare_and_write": false, 00:13:38.807 "abort": true, 00:13:38.807 "seek_hole": false, 00:13:38.807 "seek_data": false, 00:13:38.807 "copy": true, 00:13:38.807 "nvme_iov_md": false 00:13:38.807 }, 00:13:38.807 "memory_domains": [ 00:13:38.807 { 00:13:38.807 "dma_device_id": "system", 00:13:38.807 "dma_device_type": 1 00:13:38.807 }, 00:13:38.807 { 00:13:38.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.807 "dma_device_type": 2 00:13:38.807 } 00:13:38.807 ], 00:13:38.807 "driver_specific": { 00:13:38.807 "passthru": { 00:13:38.807 "name": "pt1", 00:13:38.807 "base_bdev_name": "malloc1" 00:13:38.807 } 00:13:38.807 } 00:13:38.807 }' 00:13:38.807 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.807 22:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.807 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:38.807 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.807 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.807 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:38.807 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:39.065 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:39.065 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:39.065 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.065 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.065 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:39.065 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:39.065 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:39.065 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:39.323 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:39.323 "name": "pt2", 00:13:39.323 "aliases": [ 00:13:39.323 "00000000-0000-0000-0000-000000000002" 00:13:39.323 ], 00:13:39.323 "product_name": "passthru", 00:13:39.323 "block_size": 512, 00:13:39.323 "num_blocks": 65536, 00:13:39.323 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:39.323 "assigned_rate_limits": { 00:13:39.323 "rw_ios_per_sec": 0, 00:13:39.323 "rw_mbytes_per_sec": 0, 00:13:39.323 "r_mbytes_per_sec": 0, 00:13:39.323 "w_mbytes_per_sec": 0 
00:13:39.323 }, 00:13:39.323 "claimed": true, 00:13:39.323 "claim_type": "exclusive_write", 00:13:39.323 "zoned": false, 00:13:39.323 "supported_io_types": { 00:13:39.323 "read": true, 00:13:39.323 "write": true, 00:13:39.323 "unmap": true, 00:13:39.323 "flush": true, 00:13:39.323 "reset": true, 00:13:39.323 "nvme_admin": false, 00:13:39.323 "nvme_io": false, 00:13:39.323 "nvme_io_md": false, 00:13:39.323 "write_zeroes": true, 00:13:39.323 "zcopy": true, 00:13:39.323 "get_zone_info": false, 00:13:39.323 "zone_management": false, 00:13:39.323 "zone_append": false, 00:13:39.323 "compare": false, 00:13:39.323 "compare_and_write": false, 00:13:39.323 "abort": true, 00:13:39.323 "seek_hole": false, 00:13:39.323 "seek_data": false, 00:13:39.323 "copy": true, 00:13:39.323 "nvme_iov_md": false 00:13:39.323 }, 00:13:39.323 "memory_domains": [ 00:13:39.323 { 00:13:39.323 "dma_device_id": "system", 00:13:39.323 "dma_device_type": 1 00:13:39.323 }, 00:13:39.323 { 00:13:39.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:39.323 "dma_device_type": 2 00:13:39.323 } 00:13:39.323 ], 00:13:39.323 "driver_specific": { 00:13:39.323 "passthru": { 00:13:39.323 "name": "pt2", 00:13:39.323 "base_bdev_name": "malloc2" 00:13:39.323 } 00:13:39.323 } 00:13:39.323 }' 00:13:39.323 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:39.323 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:39.323 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:39.323 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:39.323 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:39.581 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:39.581 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:39.581 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:39.581 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:39.581 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.581 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.581 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:39.581 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:39.581 22:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:39.838 [2024-07-12 22:20:50.075367] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:39.838 22:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8a1e635a-9143-4d13-a8f3-7ef401c0078b 00:13:39.838 22:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 8a1e635a-9143-4d13-a8f3-7ef401c0078b ']' 00:13:39.839 22:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:40.097 [2024-07-12 22:20:50.319751] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:40.097 [2024-07-12 22:20:50.319775] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:13:40.097 [2024-07-12 22:20:50.319830] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:40.097 [2024-07-12 22:20:50.319886] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:40.097 [2024-07-12 22:20:50.319898] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1178270 name raid_bdev1, state offline 00:13:40.097 22:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.097 22:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:40.355 22:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:40.355 22:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:40.355 22:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:40.355 22:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:40.614 22:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:40.614 22:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:40.872 22:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:40.872 22:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:41.131 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:41.390 [2024-07-12 22:20:51.546947] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:41.390 [2024-07-12 22:20:51.548342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:41.390 [2024-07-12 22:20:51.548399] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:41.390 [2024-07-12 22:20:51.548438] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:41.390 [2024-07-12 22:20:51.548458] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:41.390 [2024-07-12 22:20:51.548468] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1177ff0 name raid_bdev1, state configuring 00:13:41.390 request: 00:13:41.390 { 00:13:41.390 "name": "raid_bdev1", 00:13:41.390 "raid_level": "raid1", 00:13:41.390 "base_bdevs": [ 00:13:41.390 "malloc1", 00:13:41.390 "malloc2" 00:13:41.390 ], 00:13:41.390 "superblock": false, 00:13:41.390 "method": "bdev_raid_create", 00:13:41.390 "req_id": 1 00:13:41.390 } 00:13:41.390 Got JSON-RPC error response 00:13:41.390 response: 00:13:41.390 { 00:13:41.390 "code": -17, 00:13:41.390 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:41.390 } 00:13:41.390 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:41.390 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:41.390 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:41.390 22:20:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:41.390 22:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.390 22:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:41.648 22:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:41.648 22:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:41.648 22:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:41.907 [2024-07-12 22:20:52.044204] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:41.907 [2024-07-12 22:20:52.044248] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:41.907 [2024-07-12 22:20:52.044271] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfd47a0 00:13:41.907 [2024-07-12 22:20:52.044283] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:41.907 [2024-07-12 22:20:52.045915] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:13:41.907 [2024-07-12 22:20:52.045951] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:41.907 [2024-07-12 22:20:52.046024] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:41.907 [2024-07-12 22:20:52.046051] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:41.907 pt1 00:13:41.907 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:13:41.907 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:41.907 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:41.907 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:41.907 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:41.907 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:41.907 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:41.907 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:41.907 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:41.907 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:41.907 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.907 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:42.165 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.165 "name": "raid_bdev1", 00:13:42.165 "uuid": "8a1e635a-9143-4d13-a8f3-7ef401c0078b", 00:13:42.165 "strip_size_kb": 0, 00:13:42.165 "state": "configuring", 00:13:42.165 "raid_level": "raid1", 00:13:42.165 "superblock": true, 00:13:42.165 "num_base_bdevs": 2, 00:13:42.165 "num_base_bdevs_discovered": 1, 00:13:42.165 "num_base_bdevs_operational": 2, 00:13:42.165 "base_bdevs_list": [ 00:13:42.165 { 00:13:42.165 "name": "pt1", 00:13:42.165 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:42.165 "is_configured": true, 00:13:42.165 "data_offset": 2048, 00:13:42.165 "data_size": 63488 00:13:42.165 }, 00:13:42.165 { 00:13:42.165 "name": null, 00:13:42.165 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:42.165 "is_configured": false, 00:13:42.165 "data_offset": 2048, 00:13:42.165 "data_size": 63488 00:13:42.165 } 00:13:42.165 ] 00:13:42.165 }' 00:13:42.165 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.165 22:20:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:42.729 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:13:42.729 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:42.729 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:42.729 22:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:13:42.729 [2024-07-12 22:20:53.034838] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:42.729 [2024-07-12 22:20:53.034888] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:42.729 [2024-07-12 22:20:53.034907] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x116c6f0 00:13:42.729 [2024-07-12 22:20:53.034919] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:42.729 [2024-07-12 22:20:53.035281] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:42.729 [2024-07-12 22:20:53.035301] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:42.729 [2024-07-12 22:20:53.035364] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:42.729 [2024-07-12 22:20:53.035385] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:42.729 [2024-07-12 22:20:53.035488] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x116d590 00:13:42.729 [2024-07-12 22:20:53.035499] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:42.729 [2024-07-12 22:20:53.035666] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfce540 00:13:42.729 [2024-07-12 22:20:53.035798] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x116d590 00:13:42.729 [2024-07-12 22:20:53.035808] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x116d590 00:13:42.729 [2024-07-12 22:20:53.035908] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:42.729 pt2 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.986 "name": "raid_bdev1", 00:13:42.986 
"uuid": "8a1e635a-9143-4d13-a8f3-7ef401c0078b", 00:13:42.986 "strip_size_kb": 0, 00:13:42.986 "state": "online", 00:13:42.986 "raid_level": "raid1", 00:13:42.986 "superblock": true, 00:13:42.986 "num_base_bdevs": 2, 00:13:42.986 "num_base_bdevs_discovered": 2, 00:13:42.986 "num_base_bdevs_operational": 2, 00:13:42.986 "base_bdevs_list": [ 00:13:42.986 { 00:13:42.986 "name": "pt1", 00:13:42.986 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:42.986 "is_configured": true, 00:13:42.986 "data_offset": 2048, 00:13:42.986 "data_size": 63488 00:13:42.986 }, 00:13:42.986 { 00:13:42.986 "name": "pt2", 00:13:42.986 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:42.986 "is_configured": true, 00:13:42.986 "data_offset": 2048, 00:13:42.986 "data_size": 63488 00:13:42.986 } 00:13:42.986 ] 00:13:42.986 }' 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.986 22:20:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.917 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:43.917 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:43.917 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:43.917 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:43.917 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:43.917 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:43.917 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:43.917 22:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:43.917 [2024-07-12 22:20:54.158198] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:43.917 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:43.917 "name": "raid_bdev1", 00:13:43.917 "aliases": [ 00:13:43.917 "8a1e635a-9143-4d13-a8f3-7ef401c0078b" 00:13:43.917 ], 00:13:43.917 "product_name": "Raid Volume", 00:13:43.917 "block_size": 512, 00:13:43.917 "num_blocks": 63488, 00:13:43.917 "uuid": "8a1e635a-9143-4d13-a8f3-7ef401c0078b", 00:13:43.917 "assigned_rate_limits": { 00:13:43.917 "rw_ios_per_sec": 0, 00:13:43.917 "rw_mbytes_per_sec": 0, 00:13:43.917 "r_mbytes_per_sec": 0, 00:13:43.917 "w_mbytes_per_sec": 0 00:13:43.917 }, 00:13:43.917 "claimed": false, 00:13:43.917 "zoned": false, 00:13:43.917 "supported_io_types": { 00:13:43.917 "read": true, 00:13:43.917 "write": true, 00:13:43.917 "unmap": false, 00:13:43.917 "flush": false, 00:13:43.917 "reset": true, 00:13:43.917 "nvme_admin": false, 00:13:43.917 "nvme_io": false, 00:13:43.917 "nvme_io_md": false, 00:13:43.917 "write_zeroes": true, 00:13:43.917 "zcopy": false, 00:13:43.917 "get_zone_info": false, 00:13:43.917 "zone_management": false, 00:13:43.917 "zone_append": false, 00:13:43.917 "compare": false, 00:13:43.917 "compare_and_write": false, 00:13:43.917 "abort": false, 00:13:43.917 "seek_hole": false, 00:13:43.917 "seek_data": false, 00:13:43.917 "copy": false, 00:13:43.917 "nvme_iov_md": false 00:13:43.917 }, 00:13:43.917 "memory_domains": [ 00:13:43.917 { 00:13:43.917 "dma_device_id": "system", 00:13:43.917 "dma_device_type": 1 
00:13:43.917 }, 00:13:43.917 { 00:13:43.917 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:43.917 "dma_device_type": 2 00:13:43.917 }, 00:13:43.917 { 00:13:43.917 "dma_device_id": "system", 00:13:43.917 "dma_device_type": 1 00:13:43.917 }, 00:13:43.917 { 00:13:43.917 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:43.917 "dma_device_type": 2 00:13:43.917 } 00:13:43.917 ], 00:13:43.917 "driver_specific": { 00:13:43.917 "raid": { 00:13:43.917 "uuid": "8a1e635a-9143-4d13-a8f3-7ef401c0078b", 00:13:43.917 "strip_size_kb": 0, 00:13:43.917 "state": "online", 00:13:43.917 "raid_level": "raid1", 00:13:43.917 "superblock": true, 00:13:43.917 "num_base_bdevs": 2, 00:13:43.917 "num_base_bdevs_discovered": 2, 00:13:43.917 "num_base_bdevs_operational": 2, 00:13:43.917 "base_bdevs_list": [ 00:13:43.917 { 00:13:43.917 "name": "pt1", 00:13:43.917 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:43.917 "is_configured": true, 00:13:43.917 "data_offset": 2048, 00:13:43.917 "data_size": 63488 00:13:43.917 }, 00:13:43.917 { 00:13:43.917 "name": "pt2", 00:13:43.917 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:43.917 "is_configured": true, 00:13:43.917 "data_offset": 2048, 00:13:43.917 "data_size": 63488 00:13:43.917 } 00:13:43.917 ] 00:13:43.917 } 00:13:43.917 } 00:13:43.917 }' 00:13:43.917 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:43.917 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:43.917 pt2' 00:13:43.917 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:43.917 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:43.917 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:44.175 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:44.175 "name": "pt1", 00:13:44.175 "aliases": [ 00:13:44.175 "00000000-0000-0000-0000-000000000001" 00:13:44.175 ], 00:13:44.175 "product_name": "passthru", 00:13:44.175 "block_size": 512, 00:13:44.175 "num_blocks": 65536, 00:13:44.175 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:44.175 "assigned_rate_limits": { 00:13:44.175 "rw_ios_per_sec": 0, 00:13:44.175 "rw_mbytes_per_sec": 0, 00:13:44.175 "r_mbytes_per_sec": 0, 00:13:44.175 "w_mbytes_per_sec": 0 00:13:44.175 }, 00:13:44.175 "claimed": true, 00:13:44.175 "claim_type": "exclusive_write", 00:13:44.175 "zoned": false, 00:13:44.175 "supported_io_types": { 00:13:44.175 "read": true, 00:13:44.175 "write": true, 00:13:44.175 "unmap": true, 00:13:44.175 "flush": true, 00:13:44.175 "reset": true, 00:13:44.175 "nvme_admin": false, 00:13:44.175 "nvme_io": false, 00:13:44.175 "nvme_io_md": false, 00:13:44.175 "write_zeroes": true, 00:13:44.175 "zcopy": true, 00:13:44.175 "get_zone_info": false, 00:13:44.175 "zone_management": false, 00:13:44.175 "zone_append": false, 00:13:44.175 "compare": false, 00:13:44.175 "compare_and_write": false, 00:13:44.175 "abort": true, 00:13:44.175 "seek_hole": false, 00:13:44.175 "seek_data": false, 00:13:44.175 "copy": true, 00:13:44.175 "nvme_iov_md": false 00:13:44.175 }, 00:13:44.175 "memory_domains": [ 00:13:44.175 { 00:13:44.175 "dma_device_id": "system", 00:13:44.175 "dma_device_type": 1 00:13:44.176 }, 00:13:44.176 { 00:13:44.176 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:44.176 "dma_device_type": 2 00:13:44.176 } 00:13:44.176 ], 00:13:44.176 "driver_specific": { 00:13:44.176 "passthru": { 00:13:44.176 "name": "pt1", 00:13:44.176 "base_bdev_name": "malloc1" 00:13:44.176 } 00:13:44.176 } 00:13:44.176 }' 00:13:44.176 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:44.176 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:44.176 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:44.176 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:44.434 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:44.434 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:44.434 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:44.434 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:44.434 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:44.434 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:44.434 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:44.434 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:44.434 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:44.434 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:44.434 22:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:44.999 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:44.999 "name": "pt2", 00:13:44.999 "aliases": [ 00:13:44.999 "00000000-0000-0000-0000-000000000002" 00:13:44.999 ], 00:13:44.999 "product_name": "passthru", 00:13:44.999 "block_size": 512, 00:13:44.999 "num_blocks": 65536, 00:13:44.999 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:44.999 "assigned_rate_limits": { 00:13:44.999 "rw_ios_per_sec": 0, 00:13:44.999 "rw_mbytes_per_sec": 0, 00:13:44.999 "r_mbytes_per_sec": 0, 00:13:44.999 "w_mbytes_per_sec": 0 00:13:44.999 }, 00:13:44.999 "claimed": true, 00:13:44.999 "claim_type": "exclusive_write", 00:13:44.999 "zoned": false, 00:13:44.999 "supported_io_types": { 00:13:44.999 "read": true, 00:13:44.999 "write": true, 00:13:44.999 "unmap": true, 00:13:44.999 "flush": true, 00:13:44.999 "reset": true, 00:13:44.999 "nvme_admin": false, 00:13:44.999 "nvme_io": false, 00:13:44.999 "nvme_io_md": false, 00:13:44.999 "write_zeroes": true, 00:13:44.999 "zcopy": true, 00:13:44.999 "get_zone_info": false, 00:13:44.999 "zone_management": false, 00:13:44.999 "zone_append": false, 00:13:44.999 "compare": false, 00:13:44.999 "compare_and_write": false, 00:13:44.999 "abort": true, 00:13:44.999 "seek_hole": false, 00:13:44.999 "seek_data": false, 00:13:44.999 "copy": true, 00:13:44.999 "nvme_iov_md": false 00:13:44.999 }, 00:13:44.999 "memory_domains": [ 00:13:44.999 { 00:13:44.999 "dma_device_id": "system", 00:13:44.999 "dma_device_type": 1 00:13:44.999 }, 00:13:44.999 { 00:13:44.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:44.999 "dma_device_type": 2 00:13:44.999 } 00:13:44.999 ], 00:13:44.999 "driver_specific": { 
00:13:44.999 "passthru": { 00:13:44.999 "name": "pt2", 00:13:44.999 "base_bdev_name": "malloc2" 00:13:44.999 } 00:13:44.999 } 00:13:44.999 }' 00:13:44.999 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:44.999 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:45.256 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:45.257 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:45.257 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:45.257 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:45.257 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:45.257 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:45.257 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:45.257 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:45.257 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:45.516 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:45.516 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:45.516 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:45.516 [2024-07-12 22:20:55.814624] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:45.516 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 8a1e635a-9143-4d13-a8f3-7ef401c0078b '!=' 8a1e635a-9143-4d13-a8f3-7ef401c0078b ']' 00:13:45.516 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:13:45.516 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:45.516 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:45.516 22:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:45.775 [2024-07-12 22:20:56.059026] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:13:45.775 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:45.775 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:45.775 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:45.775 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:45.775 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:45.775 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:45.775 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.775 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.775 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.775 22:20:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.775 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.775 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:46.034 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.034 "name": "raid_bdev1", 00:13:46.034 "uuid": "8a1e635a-9143-4d13-a8f3-7ef401c0078b", 00:13:46.034 "strip_size_kb": 0, 00:13:46.034 "state": "online", 00:13:46.034 "raid_level": "raid1", 00:13:46.034 "superblock": true, 00:13:46.034 "num_base_bdevs": 2, 00:13:46.034 "num_base_bdevs_discovered": 1, 00:13:46.034 "num_base_bdevs_operational": 1, 00:13:46.034 "base_bdevs_list": [ 00:13:46.034 { 00:13:46.034 "name": null, 00:13:46.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.034 "is_configured": false, 00:13:46.035 "data_offset": 2048, 00:13:46.035 "data_size": 63488 00:13:46.035 }, 00:13:46.035 { 00:13:46.035 "name": "pt2", 00:13:46.035 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:46.035 "is_configured": true, 00:13:46.035 "data_offset": 2048, 00:13:46.035 "data_size": 63488 00:13:46.035 } 00:13:46.035 ] 00:13:46.035 }' 00:13:46.035 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.035 22:20:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:46.601 22:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:47.169 [2024-07-12 22:20:57.390546] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:47.169 [2024-07-12 22:20:57.390574] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:47.169 [2024-07-12 22:20:57.390632] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:47.169 [2024-07-12 22:20:57.390674] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:47.169 [2024-07-12 22:20:57.390686] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116d590 name raid_bdev1, state offline 00:13:47.169 22:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.169 22:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:13:47.428 22:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:13:47.428 22:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:13:47.428 22:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:13:47.428 22:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:47.428 22:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:47.687 22:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:13:47.687 22:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:47.687 22:20:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:13:47.687 22:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:13:47.687 22:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:13:47.687 22:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:47.945 [2024-07-12 22:20:58.136487] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:47.945 [2024-07-12 22:20:58.136534] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:47.945 [2024-07-12 22:20:58.136553] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfd5160 00:13:47.945 [2024-07-12 22:20:58.136566] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:47.945 [2024-07-12 22:20:58.138168] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:47.945 [2024-07-12 22:20:58.138198] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:47.945 [2024-07-12 22:20:58.138266] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:47.945 [2024-07-12 22:20:58.138294] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:47.945 [2024-07-12 22:20:58.138380] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfcb380 00:13:47.945 [2024-07-12 22:20:58.138391] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:47.945 [2024-07-12 22:20:58.138564] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfcca80 00:13:47.945 [2024-07-12 22:20:58.138685] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfcb380 00:13:47.945 [2024-07-12 22:20:58.138695] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfcb380 00:13:47.945 [2024-07-12 22:20:58.138791] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:47.945 pt2 00:13:47.946 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:47.946 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:47.946 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:47.946 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:47.946 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:47.946 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:47.946 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.946 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.946 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.946 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.946 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:47.946 22:20:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.204 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:48.204 "name": "raid_bdev1", 00:13:48.204 "uuid": "8a1e635a-9143-4d13-a8f3-7ef401c0078b", 00:13:48.204 "strip_size_kb": 0, 00:13:48.204 "state": "online", 00:13:48.204 "raid_level": "raid1", 00:13:48.204 "superblock": true, 00:13:48.204 "num_base_bdevs": 2, 00:13:48.204 "num_base_bdevs_discovered": 1, 00:13:48.204 "num_base_bdevs_operational": 1, 00:13:48.204 "base_bdevs_list": [ 00:13:48.204 { 00:13:48.204 "name": null, 00:13:48.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.204 "is_configured": false, 00:13:48.204 "data_offset": 2048, 00:13:48.204 "data_size": 63488 00:13:48.204 }, 00:13:48.204 { 00:13:48.204 "name": "pt2", 00:13:48.204 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:48.204 "is_configured": true, 00:13:48.204 "data_offset": 2048, 00:13:48.204 "data_size": 63488 00:13:48.204 } 00:13:48.204 ] 00:13:48.204 }' 00:13:48.204 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.204 22:20:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.772 22:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:49.032 [2024-07-12 22:20:59.175229] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:49.032 [2024-07-12 22:20:59.175254] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:49.032 [2024-07-12 22:20:59.175309] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:49.032 [2024-07-12 22:20:59.175353] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:49.032 [2024-07-12 22:20:59.175364] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfcb380 name raid_bdev1, state offline 00:13:49.032 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:13:49.032 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.291 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:13:49.291 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:13:49.291 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:13:49.291 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:49.550 [2024-07-12 22:20:59.668662] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:49.550 [2024-07-12 22:20:59.668712] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:49.550 [2024-07-12 22:20:59.668731] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1177520 00:13:49.550 [2024-07-12 22:20:59.668744] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:49.550 [2024-07-12 22:20:59.670345] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:49.550 [2024-07-12 22:20:59.670376] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:49.550 [2024-07-12 22:20:59.670442] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:49.550 [2024-07-12 22:20:59.670468] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:49.550 [2024-07-12 22:20:59.670568] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:13:49.550 [2024-07-12 22:20:59.670581] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:49.550 [2024-07-12 22:20:59.670594] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfcc3f0 name raid_bdev1, state configuring 00:13:49.550 [2024-07-12 22:20:59.670617] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:49.550 [2024-07-12 22:20:59.670678] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfce2b0 00:13:49.550 [2024-07-12 22:20:59.670689] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:49.550 [2024-07-12 22:20:59.670852] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfcb350 00:13:49.550 [2024-07-12 22:20:59.670984] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfce2b0 00:13:49.550 [2024-07-12 22:20:59.670994] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfce2b0 00:13:49.550 [2024-07-12 22:20:59.671093] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:49.550 pt1 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.550 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:49.809 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:49.809 "name": "raid_bdev1", 00:13:49.809 "uuid": "8a1e635a-9143-4d13-a8f3-7ef401c0078b", 00:13:49.809 "strip_size_kb": 0, 00:13:49.809 "state": "online", 
00:13:49.809 "raid_level": "raid1", 00:13:49.809 "superblock": true, 00:13:49.809 "num_base_bdevs": 2, 00:13:49.809 "num_base_bdevs_discovered": 1, 00:13:49.809 "num_base_bdevs_operational": 1, 00:13:49.809 "base_bdevs_list": [ 00:13:49.809 { 00:13:49.809 "name": null, 00:13:49.809 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:49.809 "is_configured": false, 00:13:49.809 "data_offset": 2048, 00:13:49.809 "data_size": 63488 00:13:49.809 }, 00:13:49.809 { 00:13:49.809 "name": "pt2", 00:13:49.809 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:49.809 "is_configured": true, 00:13:49.809 "data_offset": 2048, 00:13:49.809 "data_size": 63488 00:13:49.809 } 00:13:49.809 ] 00:13:49.809 }' 00:13:49.809 22:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:49.809 22:20:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:50.067 22:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:13:50.067 22:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:13:50.326 22:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:13:50.326 22:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:50.326 22:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:13:50.891 [2024-07-12 22:21:01.096668] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:50.891 22:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 8a1e635a-9143-4d13-a8f3-7ef401c0078b '!=' 8a1e635a-9143-4d13-a8f3-7ef401c0078b ']' 00:13:50.891 22:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3437219 00:13:50.891 22:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3437219 ']' 00:13:50.891 22:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3437219 00:13:50.891 22:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:50.891 22:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:50.891 22:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3437219 00:13:50.891 22:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:50.891 22:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:50.891 22:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3437219' 00:13:50.891 killing process with pid 3437219 00:13:50.891 22:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3437219 00:13:50.891 [2024-07-12 22:21:01.180509] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:50.891 [2024-07-12 22:21:01.180565] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:50.891 [2024-07-12 22:21:01.180615] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:50.891 [2024-07-12 22:21:01.180628] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0xfce2b0 name raid_bdev1, state offline 00:13:50.891 22:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3437219 00:13:50.891 [2024-07-12 22:21:01.196907] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:51.149 22:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:51.149 00:13:51.149 real 0m16.027s 00:13:51.149 user 0m29.168s 00:13:51.149 sys 0m2.856s 00:13:51.149 22:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:51.149 22:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.149 ************************************ 00:13:51.149 END TEST raid_superblock_test 00:13:51.149 ************************************ 00:13:51.149 22:21:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:51.149 22:21:01 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:13:51.149 22:21:01 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:51.149 22:21:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:51.149 22:21:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:51.409 ************************************ 00:13:51.409 START TEST raid_read_error_test 00:13:51.409 ************************************ 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.C6jUG50I9F 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3439748 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3439748 /var/tmp/spdk-raid.sock 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3439748 ']' 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:51.409 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:51.409 22:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.409 [2024-07-12 22:21:01.552476] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:13:51.409 [2024-07-12 22:21:01.552544] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3439748 ] 00:13:51.409 [2024-07-12 22:21:01.682371] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:51.667 [2024-07-12 22:21:01.788696] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:51.667 [2024-07-12 22:21:01.852985] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:51.667 [2024-07-12 22:21:01.853025] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:52.234 22:21:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:52.234 22:21:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:52.234 22:21:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:52.234 22:21:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:52.493 BaseBdev1_malloc 00:13:52.493 22:21:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:52.752 true 00:13:52.752 22:21:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:53.010 [2024-07-12 22:21:03.136153] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:53.010 [2024-07-12 22:21:03.136199] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:53.010 [2024-07-12 22:21:03.136220] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbdf0d0 00:13:53.010 [2024-07-12 22:21:03.136233] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:53.010 [2024-07-12 22:21:03.138145] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:53.010 [2024-07-12 22:21:03.138178] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:53.010 BaseBdev1 00:13:53.010 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:53.010 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:53.010 BaseBdev2_malloc 00:13:53.010 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:53.269 true 00:13:53.269 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:53.528 [2024-07-12 22:21:03.695496] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:53.528 [2024-07-12 22:21:03.695542] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:53.528 [2024-07-12 22:21:03.695563] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbe3910 00:13:53.528 [2024-07-12 22:21:03.695583] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:53.528 [2024-07-12 22:21:03.697182] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:53.528 [2024-07-12 22:21:03.697214] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:53.528 BaseBdev2 00:13:53.528 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:53.787 [2024-07-12 22:21:03.924124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:53.787 [2024-07-12 22:21:03.925517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:53.787 [2024-07-12 22:21:03.925721] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbe5320 00:13:53.787 [2024-07-12 22:21:03.925735] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:53.787 [2024-07-12 22:21:03.925940] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa4cd00 00:13:53.787 [2024-07-12 22:21:03.926093] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbe5320 00:13:53.787 [2024-07-12 22:21:03.926104] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbe5320 00:13:53.787 [2024-07-12 22:21:03.926213] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:53.787 22:21:03 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:53.787 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:53.787 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:53.787 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:53.787 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:53.787 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:53.787 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.787 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.787 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.787 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.787 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.787 22:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:54.354 22:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:54.354 "name": "raid_bdev1", 00:13:54.354 "uuid": "8c653e2a-ced3-4f0d-862d-ed4126266f12", 00:13:54.354 "strip_size_kb": 0, 00:13:54.354 "state": "online", 00:13:54.354 "raid_level": "raid1", 00:13:54.354 "superblock": true, 00:13:54.354 "num_base_bdevs": 2, 00:13:54.354 "num_base_bdevs_discovered": 2, 00:13:54.354 "num_base_bdevs_operational": 2, 00:13:54.354 "base_bdevs_list": [ 00:13:54.354 { 00:13:54.354 "name": "BaseBdev1", 00:13:54.354 "uuid": "b21f3947-1667-5b58-993a-cdcc4b175824", 00:13:54.354 "is_configured": true, 00:13:54.354 "data_offset": 2048, 00:13:54.354 "data_size": 63488 00:13:54.354 }, 00:13:54.354 { 00:13:54.354 "name": "BaseBdev2", 00:13:54.354 "uuid": "ea008946-48ac-59e5-bb3c-03230e711909", 00:13:54.354 "is_configured": true, 00:13:54.354 "data_offset": 2048, 00:13:54.354 "data_size": 63488 00:13:54.354 } 00:13:54.354 ] 00:13:54.354 }' 00:13:54.354 22:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:54.354 22:21:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.924 22:21:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:54.924 22:21:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:54.924 [2024-07-12 22:21:05.163675] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbe0c70 00:13:55.942 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 
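For reference, the read-error path traced above boils down to the following RPC sequence over the same /var/tmp/spdk-raid.sock socket: each base bdev is a malloc bdev wrapped by an error bdev (which exposes an EE_<name> device) and a passthru bdev, the two passthru bdevs are assembled into raid_bdev1, and a read failure is then injected into one leg. This is a minimal sketch reconstructed from the trace, not the test script itself; the shell variables and ordering are illustrative.

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
$rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
$rpc -s $sock bdev_error_create BaseBdev1_malloc                     # exposes EE_BaseBdev1_malloc
$rpc -s $sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
$rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
$rpc -s $sock bdev_error_create BaseBdev2_malloc                     # exposes EE_BaseBdev2_malloc
$rpc -s $sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
$rpc -s $sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
$rpc -s $sock bdev_error_inject_error EE_BaseBdev1_malloc read failure

Because raid1 carries redundancy, the injected read failures are expected to be absorbed by the surviving base bdev, which is why the fail-per-second figure checked at the end of this test is asserted to be 0.00.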
00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:56.202 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.462 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:56.462 "name": "raid_bdev1", 00:13:56.462 "uuid": "8c653e2a-ced3-4f0d-862d-ed4126266f12", 00:13:56.462 "strip_size_kb": 0, 00:13:56.462 "state": "online", 00:13:56.462 "raid_level": "raid1", 00:13:56.462 "superblock": true, 00:13:56.462 "num_base_bdevs": 2, 00:13:56.462 "num_base_bdevs_discovered": 2, 00:13:56.462 "num_base_bdevs_operational": 2, 00:13:56.462 "base_bdevs_list": [ 00:13:56.462 { 00:13:56.462 "name": "BaseBdev1", 00:13:56.462 "uuid": "b21f3947-1667-5b58-993a-cdcc4b175824", 00:13:56.462 "is_configured": true, 00:13:56.462 "data_offset": 2048, 00:13:56.462 "data_size": 63488 00:13:56.462 }, 00:13:56.462 { 00:13:56.462 "name": "BaseBdev2", 00:13:56.462 "uuid": "ea008946-48ac-59e5-bb3c-03230e711909", 00:13:56.462 "is_configured": true, 00:13:56.462 "data_offset": 2048, 00:13:56.462 "data_size": 63488 00:13:56.462 } 00:13:56.462 ] 00:13:56.462 }' 00:13:56.462 22:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.462 22:21:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:57.030 22:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:57.290 [2024-07-12 22:21:07.377832] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:57.290 [2024-07-12 22:21:07.377874] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:57.290 [2024-07-12 22:21:07.381006] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:57.290 [2024-07-12 22:21:07.381037] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:57.290 [2024-07-12 22:21:07.381118] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:57.290 
[2024-07-12 22:21:07.381130] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbe5320 name raid_bdev1, state offline 00:13:57.290 0 00:13:57.290 22:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3439748 00:13:57.290 22:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3439748 ']' 00:13:57.290 22:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3439748 00:13:57.290 22:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:57.290 22:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:57.290 22:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3439748 00:13:57.290 22:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:57.290 22:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:57.290 22:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3439748' 00:13:57.290 killing process with pid 3439748 00:13:57.290 22:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3439748 00:13:57.290 [2024-07-12 22:21:07.443318] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:57.290 22:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3439748 00:13:57.290 [2024-07-12 22:21:07.453998] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:57.549 22:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.C6jUG50I9F 00:13:57.549 22:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:57.549 22:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:57.549 22:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:57.549 22:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:57.549 22:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:57.549 22:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:57.549 22:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:57.549 00:13:57.549 real 0m6.209s 00:13:57.549 user 0m9.705s 00:13:57.549 sys 0m1.069s 00:13:57.549 22:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:57.549 22:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:57.549 ************************************ 00:13:57.549 END TEST raid_read_error_test 00:13:57.549 ************************************ 00:13:57.549 22:21:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:57.549 22:21:07 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:13:57.549 22:21:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:57.549 22:21:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:57.549 22:21:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:57.549 ************************************ 00:13:57.549 START TEST raid_write_error_test 00:13:57.549 ************************************ 00:13:57.549 22:21:07 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:13:57.549 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:57.549 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:57.549 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:57.549 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:57.549 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:57.549 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:57.549 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.FIdaKFwfPK 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3441030 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3441030 /var/tmp/spdk-raid.sock 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3441030 ']' 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:57.550 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
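The bdevperf process here is started idle (-z) against its own RPC socket; the error/passthru base bdevs and raid_bdev1 are then configured over that socket, and the randrw workload is only kicked off afterwards through the bdevperf.py helper. A rough sketch of that pattern, using the exact paths and flags seen in this run (the backgrounding and the placeholder comment are illustrative):

/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
    -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid &
# ... wait for the socket, then create the base bdevs and raid_bdev1 over it ...
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
    -s /var/tmp/spdk-raid.sock perform_tests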
00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:57.550 22:21:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:57.550 [2024-07-12 22:21:07.839570] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:13:57.550 [2024-07-12 22:21:07.839637] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3441030 ] 00:13:57.809 [2024-07-12 22:21:07.961294] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:57.809 [2024-07-12 22:21:08.060846] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:57.809 [2024-07-12 22:21:08.122751] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:57.809 [2024-07-12 22:21:08.122786] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:58.376 22:21:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:58.376 22:21:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:58.376 22:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:58.376 22:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:58.635 BaseBdev1_malloc 00:13:58.635 22:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:58.895 true 00:13:58.895 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:58.895 [2024-07-12 22:21:09.168619] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:58.895 [2024-07-12 22:21:09.168664] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:58.895 [2024-07-12 22:21:09.168685] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf380d0 00:13:58.895 [2024-07-12 22:21:09.168698] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:58.895 [2024-07-12 22:21:09.170575] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:58.895 [2024-07-12 22:21:09.170605] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:58.895 BaseBdev1 00:13:58.895 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:58.895 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:59.154 BaseBdev2_malloc 00:13:59.154 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:59.413 true 00:13:59.413 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:59.413 [2024-07-12 22:21:09.723996] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:59.413 [2024-07-12 22:21:09.724039] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:59.413 [2024-07-12 22:21:09.724061] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf3c910 00:13:59.413 [2024-07-12 22:21:09.724074] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:59.413 [2024-07-12 22:21:09.725660] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:59.413 [2024-07-12 22:21:09.725689] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:59.413 BaseBdev2 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:59.672 [2024-07-12 22:21:09.888460] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:59.672 [2024-07-12 22:21:09.889793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:59.672 [2024-07-12 22:21:09.889992] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf3e320 00:13:59.672 [2024-07-12 22:21:09.890006] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:59.672 [2024-07-12 22:21:09.890198] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xda5d00 00:13:59.672 [2024-07-12 22:21:09.890353] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf3e320 00:13:59.672 [2024-07-12 22:21:09.890364] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf3e320 00:13:59.672 [2024-07-12 22:21:09.890473] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.672 22:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:59.932 22:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.932 "name": "raid_bdev1", 00:13:59.932 "uuid": "0f1faaa6-7583-451c-984b-a5c7001c847f", 00:13:59.932 "strip_size_kb": 0, 00:13:59.932 "state": "online", 00:13:59.932 "raid_level": "raid1", 00:13:59.932 "superblock": true, 00:13:59.932 "num_base_bdevs": 2, 00:13:59.932 "num_base_bdevs_discovered": 2, 00:13:59.932 "num_base_bdevs_operational": 2, 00:13:59.932 "base_bdevs_list": [ 00:13:59.932 { 00:13:59.932 "name": "BaseBdev1", 00:13:59.932 "uuid": "38a2ff7d-0f69-5efc-9ed8-8d46cbcc247f", 00:13:59.932 "is_configured": true, 00:13:59.932 "data_offset": 2048, 00:13:59.932 "data_size": 63488 00:13:59.932 }, 00:13:59.932 { 00:13:59.932 "name": "BaseBdev2", 00:13:59.932 "uuid": "0755cc28-394d-5c13-8377-818325f736d2", 00:13:59.932 "is_configured": true, 00:13:59.932 "data_offset": 2048, 00:13:59.932 "data_size": 63488 00:13:59.932 } 00:13:59.932 ] 00:13:59.932 }' 00:13:59.932 22:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.932 22:21:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:00.500 22:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:00.500 22:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:00.760 [2024-07-12 22:21:10.847307] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf39c70 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:01.697 [2024-07-12 22:21:11.971147] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:14:01.697 [2024-07-12 22:21:11.971202] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:01.697 [2024-07-12 22:21:11.971378] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xf39c70 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:01.697 22:21:11 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.697 22:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:01.957 22:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.957 "name": "raid_bdev1", 00:14:01.957 "uuid": "0f1faaa6-7583-451c-984b-a5c7001c847f", 00:14:01.957 "strip_size_kb": 0, 00:14:01.957 "state": "online", 00:14:01.957 "raid_level": "raid1", 00:14:01.957 "superblock": true, 00:14:01.957 "num_base_bdevs": 2, 00:14:01.957 "num_base_bdevs_discovered": 1, 00:14:01.957 "num_base_bdevs_operational": 1, 00:14:01.957 "base_bdevs_list": [ 00:14:01.957 { 00:14:01.957 "name": null, 00:14:01.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.957 "is_configured": false, 00:14:01.957 "data_offset": 2048, 00:14:01.957 "data_size": 63488 00:14:01.957 }, 00:14:01.957 { 00:14:01.957 "name": "BaseBdev2", 00:14:01.957 "uuid": "0755cc28-394d-5c13-8377-818325f736d2", 00:14:01.957 "is_configured": true, 00:14:01.957 "data_offset": 2048, 00:14:01.957 "data_size": 63488 00:14:01.957 } 00:14:01.957 ] 00:14:01.957 }' 00:14:01.957 22:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.957 22:21:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:02.525 22:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:02.785 [2024-07-12 22:21:12.978209] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:02.785 [2024-07-12 22:21:12.978243] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:02.785 [2024-07-12 22:21:12.981375] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:02.785 [2024-07-12 22:21:12.981404] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:02.785 [2024-07-12 22:21:12.981455] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:02.785 [2024-07-12 22:21:12.981466] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf3e320 name raid_bdev1, state offline 00:14:02.785 0 00:14:02.785 22:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3441030 00:14:02.785 22:21:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3441030 ']' 00:14:02.785 22:21:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3441030 00:14:02.785 22:21:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:02.785 22:21:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:02.785 
22:21:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3441030 00:14:02.785 22:21:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:02.785 22:21:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:02.785 22:21:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3441030' 00:14:02.785 killing process with pid 3441030 00:14:02.785 22:21:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3441030 00:14:02.785 [2024-07-12 22:21:13.045719] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:02.785 22:21:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3441030 00:14:02.785 [2024-07-12 22:21:13.056290] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:03.043 22:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.FIdaKFwfPK 00:14:03.043 22:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:03.043 22:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:03.043 22:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:14:03.043 22:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:14:03.043 22:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:03.043 22:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:03.043 22:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:03.043 00:14:03.043 real 0m5.524s 00:14:03.043 user 0m8.548s 00:14:03.043 sys 0m0.915s 00:14:03.043 22:21:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:03.043 22:21:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:03.043 ************************************ 00:14:03.043 END TEST raid_write_error_test 00:14:03.043 ************************************ 00:14:03.043 22:21:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:03.043 22:21:13 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:14:03.043 22:21:13 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:03.043 22:21:13 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:14:03.043 22:21:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:03.043 22:21:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:03.043 22:21:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:03.302 ************************************ 00:14:03.302 START TEST raid_state_function_test 00:14:03.302 ************************************ 00:14:03.302 22:21:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:14:03.302 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:03.302 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:03.302 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:03.302 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local 
raid_bdev 00:14:03.302 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:03.302 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:03.302 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:03.302 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:03.302 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:03.302 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3441936 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3441936' 00:14:03.303 Process raid pid: 3441936 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3441936 /var/tmp/spdk-raid.sock 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3441936 ']' 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:14:03.303 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:03.303 22:21:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:03.303 [2024-07-12 22:21:13.453088] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:14:03.303 [2024-07-12 22:21:13.453160] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:03.303 [2024-07-12 22:21:13.584350] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:03.563 [2024-07-12 22:21:13.686586] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:03.563 [2024-07-12 22:21:13.747129] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:03.563 [2024-07-12 22:21:13.747161] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:04.132 22:21:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:04.132 22:21:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:04.132 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:04.391 [2024-07-12 22:21:14.485811] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:04.391 [2024-07-12 22:21:14.485854] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:04.391 [2024-07-12 22:21:14.485865] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:04.391 [2024-07-12 22:21:14.485877] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:04.391 [2024-07-12 22:21:14.485886] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:04.391 [2024-07-12 22:21:14.485897] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:04.391 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:04.391 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:04.391 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:04.391 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:04.391 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:04.391 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:04.391 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:04.391 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:04.391 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:04.391 22:21:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:04.391 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.391 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:04.650 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:04.650 "name": "Existed_Raid", 00:14:04.650 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:04.650 "strip_size_kb": 64, 00:14:04.650 "state": "configuring", 00:14:04.650 "raid_level": "raid0", 00:14:04.650 "superblock": false, 00:14:04.650 "num_base_bdevs": 3, 00:14:04.650 "num_base_bdevs_discovered": 0, 00:14:04.650 "num_base_bdevs_operational": 3, 00:14:04.650 "base_bdevs_list": [ 00:14:04.650 { 00:14:04.650 "name": "BaseBdev1", 00:14:04.650 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:04.650 "is_configured": false, 00:14:04.650 "data_offset": 0, 00:14:04.650 "data_size": 0 00:14:04.650 }, 00:14:04.650 { 00:14:04.650 "name": "BaseBdev2", 00:14:04.650 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:04.650 "is_configured": false, 00:14:04.650 "data_offset": 0, 00:14:04.650 "data_size": 0 00:14:04.650 }, 00:14:04.650 { 00:14:04.650 "name": "BaseBdev3", 00:14:04.650 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:04.650 "is_configured": false, 00:14:04.650 "data_offset": 0, 00:14:04.650 "data_size": 0 00:14:04.650 } 00:14:04.650 ] 00:14:04.650 }' 00:14:04.650 22:21:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:04.650 22:21:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:05.218 22:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:05.477 [2024-07-12 22:21:15.572542] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:05.477 [2024-07-12 22:21:15.572579] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e6a80 name Existed_Raid, state configuring 00:14:05.477 22:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:05.477 [2024-07-12 22:21:15.757058] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:05.477 [2024-07-12 22:21:15.757095] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:05.477 [2024-07-12 22:21:15.757105] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:05.477 [2024-07-12 22:21:15.757117] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:05.477 [2024-07-12 22:21:15.757126] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:05.477 [2024-07-12 22:21:15.757137] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:05.477 22:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 
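
    # Editorial aside (not part of the log): the step traced above reduces to a short
    # sequence of SPDK RPC calls against the test app's UNIX socket. A condensed,
    # illustrative sketch -- paths, socket and arguments copied from the trace; the
    # real bdev_raid.sh also deletes and re-creates the raid set between checks:
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # 1. Create the raid0 volume first; its base bdevs do not exist yet, so the
    #    raid bdev is registered in the "configuring" state.
    $rpc -s $sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
    # 2. Back the first slot with a malloc bdev (65536 blocks x 512 bytes = 32 MiB).
    $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev1
    # 3. Inspect the raid bdev; num_base_bdevs_discovered should now report 1.
    $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
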
00:14:05.736 [2024-07-12 22:21:15.955457] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:05.736 BaseBdev1 00:14:05.736 22:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:05.736 22:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:05.736 22:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:05.736 22:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:05.736 22:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:05.736 22:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:05.736 22:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:05.995 22:21:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:06.254 [ 00:14:06.254 { 00:14:06.254 "name": "BaseBdev1", 00:14:06.254 "aliases": [ 00:14:06.254 "13b91221-c6d3-4382-abc3-f4821a5d0573" 00:14:06.254 ], 00:14:06.254 "product_name": "Malloc disk", 00:14:06.254 "block_size": 512, 00:14:06.254 "num_blocks": 65536, 00:14:06.254 "uuid": "13b91221-c6d3-4382-abc3-f4821a5d0573", 00:14:06.254 "assigned_rate_limits": { 00:14:06.254 "rw_ios_per_sec": 0, 00:14:06.254 "rw_mbytes_per_sec": 0, 00:14:06.254 "r_mbytes_per_sec": 0, 00:14:06.254 "w_mbytes_per_sec": 0 00:14:06.254 }, 00:14:06.254 "claimed": true, 00:14:06.254 "claim_type": "exclusive_write", 00:14:06.254 "zoned": false, 00:14:06.254 "supported_io_types": { 00:14:06.254 "read": true, 00:14:06.254 "write": true, 00:14:06.254 "unmap": true, 00:14:06.254 "flush": true, 00:14:06.254 "reset": true, 00:14:06.254 "nvme_admin": false, 00:14:06.254 "nvme_io": false, 00:14:06.254 "nvme_io_md": false, 00:14:06.254 "write_zeroes": true, 00:14:06.254 "zcopy": true, 00:14:06.254 "get_zone_info": false, 00:14:06.254 "zone_management": false, 00:14:06.254 "zone_append": false, 00:14:06.254 "compare": false, 00:14:06.254 "compare_and_write": false, 00:14:06.254 "abort": true, 00:14:06.254 "seek_hole": false, 00:14:06.254 "seek_data": false, 00:14:06.254 "copy": true, 00:14:06.254 "nvme_iov_md": false 00:14:06.254 }, 00:14:06.254 "memory_domains": [ 00:14:06.254 { 00:14:06.254 "dma_device_id": "system", 00:14:06.254 "dma_device_type": 1 00:14:06.254 }, 00:14:06.254 { 00:14:06.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.254 "dma_device_type": 2 00:14:06.254 } 00:14:06.254 ], 00:14:06.254 "driver_specific": {} 00:14:06.254 } 00:14:06.254 ] 00:14:06.254 22:21:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:06.254 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:06.254 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:06.254 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:06.254 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:06.254 22:21:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:06.254 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:06.254 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:06.254 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:06.254 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:06.254 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:06.254 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.254 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:06.513 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:06.513 "name": "Existed_Raid", 00:14:06.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.513 "strip_size_kb": 64, 00:14:06.513 "state": "configuring", 00:14:06.513 "raid_level": "raid0", 00:14:06.513 "superblock": false, 00:14:06.513 "num_base_bdevs": 3, 00:14:06.513 "num_base_bdevs_discovered": 1, 00:14:06.513 "num_base_bdevs_operational": 3, 00:14:06.513 "base_bdevs_list": [ 00:14:06.513 { 00:14:06.513 "name": "BaseBdev1", 00:14:06.513 "uuid": "13b91221-c6d3-4382-abc3-f4821a5d0573", 00:14:06.513 "is_configured": true, 00:14:06.513 "data_offset": 0, 00:14:06.513 "data_size": 65536 00:14:06.513 }, 00:14:06.513 { 00:14:06.513 "name": "BaseBdev2", 00:14:06.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.513 "is_configured": false, 00:14:06.513 "data_offset": 0, 00:14:06.513 "data_size": 0 00:14:06.513 }, 00:14:06.513 { 00:14:06.513 "name": "BaseBdev3", 00:14:06.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.513 "is_configured": false, 00:14:06.513 "data_offset": 0, 00:14:06.513 "data_size": 0 00:14:06.513 } 00:14:06.513 ] 00:14:06.513 }' 00:14:06.513 22:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:06.513 22:21:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:07.090 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:07.359 [2024-07-12 22:21:17.451447] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:07.359 [2024-07-12 22:21:17.451496] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e6310 name Existed_Raid, state configuring 00:14:07.359 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:07.617 [2024-07-12 22:21:17.696128] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:07.617 [2024-07-12 22:21:17.697590] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:07.617 [2024-07-12 22:21:17.697622] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:07.617 [2024-07-12 22:21:17.697633] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:07.617 [2024-07-12 22:21:17.697645] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.617 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:07.875 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:07.875 "name": "Existed_Raid", 00:14:07.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.875 "strip_size_kb": 64, 00:14:07.875 "state": "configuring", 00:14:07.875 "raid_level": "raid0", 00:14:07.875 "superblock": false, 00:14:07.875 "num_base_bdevs": 3, 00:14:07.875 "num_base_bdevs_discovered": 1, 00:14:07.875 "num_base_bdevs_operational": 3, 00:14:07.875 "base_bdevs_list": [ 00:14:07.875 { 00:14:07.875 "name": "BaseBdev1", 00:14:07.875 "uuid": "13b91221-c6d3-4382-abc3-f4821a5d0573", 00:14:07.875 "is_configured": true, 00:14:07.875 "data_offset": 0, 00:14:07.875 "data_size": 65536 00:14:07.875 }, 00:14:07.875 { 00:14:07.875 "name": "BaseBdev2", 00:14:07.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.875 "is_configured": false, 00:14:07.875 "data_offset": 0, 00:14:07.875 "data_size": 0 00:14:07.875 }, 00:14:07.875 { 00:14:07.875 "name": "BaseBdev3", 00:14:07.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.875 "is_configured": false, 00:14:07.875 "data_offset": 0, 00:14:07.875 "data_size": 0 00:14:07.875 } 00:14:07.875 ] 00:14:07.875 }' 00:14:07.875 22:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:07.875 22:21:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:08.442 22:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
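
    # Editorial aside (not part of the log): the repeated verify_raid_bdev_state calls
    # in the trace boil down to querying the raid bdev and comparing a few JSON fields.
    # A condensed sketch with expected values taken from the dump above; the real helper
    # in bdev_raid.sh checks more fields, so this is illustrative only:
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    state=$(echo "$info" | jq -r '.state')                           # "configuring" until all base bdevs exist
    discovered=$(echo "$info" | jq -r '.num_base_bdevs_discovered')  # 1 after BaseBdev1 was claimed
    if [ "$state" != configuring ] || [ "$discovered" -ne 1 ]; then
        echo "unexpected raid state: $state ($discovered base bdevs discovered)" >&2
    fi
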
00:14:08.701 [2024-07-12 22:21:18.786414] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:08.701 BaseBdev2 00:14:08.701 22:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:08.701 22:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:08.701 22:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:08.701 22:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:08.701 22:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:08.701 22:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:08.701 22:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:08.959 22:21:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:08.959 [ 00:14:08.959 { 00:14:08.959 "name": "BaseBdev2", 00:14:08.959 "aliases": [ 00:14:08.959 "2fb7c362-1d69-4ace-8a78-7512f9c5481e" 00:14:08.959 ], 00:14:08.959 "product_name": "Malloc disk", 00:14:08.959 "block_size": 512, 00:14:08.959 "num_blocks": 65536, 00:14:08.959 "uuid": "2fb7c362-1d69-4ace-8a78-7512f9c5481e", 00:14:08.959 "assigned_rate_limits": { 00:14:08.959 "rw_ios_per_sec": 0, 00:14:08.959 "rw_mbytes_per_sec": 0, 00:14:08.959 "r_mbytes_per_sec": 0, 00:14:08.959 "w_mbytes_per_sec": 0 00:14:08.959 }, 00:14:08.959 "claimed": true, 00:14:08.959 "claim_type": "exclusive_write", 00:14:08.959 "zoned": false, 00:14:08.959 "supported_io_types": { 00:14:08.959 "read": true, 00:14:08.959 "write": true, 00:14:08.959 "unmap": true, 00:14:08.959 "flush": true, 00:14:08.959 "reset": true, 00:14:08.959 "nvme_admin": false, 00:14:08.959 "nvme_io": false, 00:14:08.959 "nvme_io_md": false, 00:14:08.959 "write_zeroes": true, 00:14:08.959 "zcopy": true, 00:14:08.959 "get_zone_info": false, 00:14:08.959 "zone_management": false, 00:14:08.959 "zone_append": false, 00:14:08.959 "compare": false, 00:14:08.959 "compare_and_write": false, 00:14:08.959 "abort": true, 00:14:08.959 "seek_hole": false, 00:14:08.959 "seek_data": false, 00:14:08.959 "copy": true, 00:14:08.959 "nvme_iov_md": false 00:14:08.959 }, 00:14:08.959 "memory_domains": [ 00:14:08.959 { 00:14:08.959 "dma_device_id": "system", 00:14:08.959 "dma_device_type": 1 00:14:08.959 }, 00:14:08.959 { 00:14:08.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.959 "dma_device_type": 2 00:14:08.959 } 00:14:08.959 ], 00:14:08.959 "driver_specific": {} 00:14:08.959 } 00:14:08.959 ] 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:09.217 "name": "Existed_Raid", 00:14:09.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.217 "strip_size_kb": 64, 00:14:09.217 "state": "configuring", 00:14:09.217 "raid_level": "raid0", 00:14:09.217 "superblock": false, 00:14:09.217 "num_base_bdevs": 3, 00:14:09.217 "num_base_bdevs_discovered": 2, 00:14:09.217 "num_base_bdevs_operational": 3, 00:14:09.217 "base_bdevs_list": [ 00:14:09.217 { 00:14:09.217 "name": "BaseBdev1", 00:14:09.217 "uuid": "13b91221-c6d3-4382-abc3-f4821a5d0573", 00:14:09.217 "is_configured": true, 00:14:09.217 "data_offset": 0, 00:14:09.217 "data_size": 65536 00:14:09.217 }, 00:14:09.217 { 00:14:09.217 "name": "BaseBdev2", 00:14:09.217 "uuid": "2fb7c362-1d69-4ace-8a78-7512f9c5481e", 00:14:09.217 "is_configured": true, 00:14:09.217 "data_offset": 0, 00:14:09.217 "data_size": 65536 00:14:09.217 }, 00:14:09.217 { 00:14:09.217 "name": "BaseBdev3", 00:14:09.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.217 "is_configured": false, 00:14:09.217 "data_offset": 0, 00:14:09.217 "data_size": 0 00:14:09.217 } 00:14:09.217 ] 00:14:09.217 }' 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:09.217 22:21:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:10.153 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:10.153 [2024-07-12 22:21:20.382065] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:10.153 [2024-07-12 22:21:20.382102] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18e7400 00:14:10.153 [2024-07-12 22:21:20.382111] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:10.153 [2024-07-12 22:21:20.382361] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e6ef0 00:14:10.153 [2024-07-12 22:21:20.382482] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18e7400 00:14:10.153 [2024-07-12 22:21:20.382492] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 
0x18e7400 00:14:10.153 [2024-07-12 22:21:20.382665] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:10.153 BaseBdev3 00:14:10.153 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:10.153 22:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:10.153 22:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:10.153 22:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:10.153 22:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:10.153 22:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:10.153 22:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:10.411 22:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:10.411 [ 00:14:10.411 { 00:14:10.411 "name": "BaseBdev3", 00:14:10.411 "aliases": [ 00:14:10.411 "25cc196a-8330-4f6e-8ead-dd994d31f54c" 00:14:10.411 ], 00:14:10.411 "product_name": "Malloc disk", 00:14:10.411 "block_size": 512, 00:14:10.411 "num_blocks": 65536, 00:14:10.411 "uuid": "25cc196a-8330-4f6e-8ead-dd994d31f54c", 00:14:10.411 "assigned_rate_limits": { 00:14:10.411 "rw_ios_per_sec": 0, 00:14:10.411 "rw_mbytes_per_sec": 0, 00:14:10.411 "r_mbytes_per_sec": 0, 00:14:10.411 "w_mbytes_per_sec": 0 00:14:10.411 }, 00:14:10.411 "claimed": true, 00:14:10.411 "claim_type": "exclusive_write", 00:14:10.411 "zoned": false, 00:14:10.411 "supported_io_types": { 00:14:10.411 "read": true, 00:14:10.411 "write": true, 00:14:10.411 "unmap": true, 00:14:10.411 "flush": true, 00:14:10.411 "reset": true, 00:14:10.411 "nvme_admin": false, 00:14:10.411 "nvme_io": false, 00:14:10.411 "nvme_io_md": false, 00:14:10.411 "write_zeroes": true, 00:14:10.411 "zcopy": true, 00:14:10.411 "get_zone_info": false, 00:14:10.411 "zone_management": false, 00:14:10.411 "zone_append": false, 00:14:10.411 "compare": false, 00:14:10.411 "compare_and_write": false, 00:14:10.411 "abort": true, 00:14:10.411 "seek_hole": false, 00:14:10.411 "seek_data": false, 00:14:10.411 "copy": true, 00:14:10.411 "nvme_iov_md": false 00:14:10.411 }, 00:14:10.411 "memory_domains": [ 00:14:10.411 { 00:14:10.411 "dma_device_id": "system", 00:14:10.412 "dma_device_type": 1 00:14:10.412 }, 00:14:10.412 { 00:14:10.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:10.412 "dma_device_type": 2 00:14:10.412 } 00:14:10.412 ], 00:14:10.412 "driver_specific": {} 00:14:10.412 } 00:14:10.412 ] 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.670 "name": "Existed_Raid", 00:14:10.670 "uuid": "9ecb4322-43e3-4cb3-90d3-2ae3dd5f629d", 00:14:10.670 "strip_size_kb": 64, 00:14:10.670 "state": "online", 00:14:10.670 "raid_level": "raid0", 00:14:10.670 "superblock": false, 00:14:10.670 "num_base_bdevs": 3, 00:14:10.670 "num_base_bdevs_discovered": 3, 00:14:10.670 "num_base_bdevs_operational": 3, 00:14:10.670 "base_bdevs_list": [ 00:14:10.670 { 00:14:10.670 "name": "BaseBdev1", 00:14:10.670 "uuid": "13b91221-c6d3-4382-abc3-f4821a5d0573", 00:14:10.670 "is_configured": true, 00:14:10.670 "data_offset": 0, 00:14:10.670 "data_size": 65536 00:14:10.670 }, 00:14:10.670 { 00:14:10.670 "name": "BaseBdev2", 00:14:10.670 "uuid": "2fb7c362-1d69-4ace-8a78-7512f9c5481e", 00:14:10.670 "is_configured": true, 00:14:10.670 "data_offset": 0, 00:14:10.670 "data_size": 65536 00:14:10.670 }, 00:14:10.670 { 00:14:10.670 "name": "BaseBdev3", 00:14:10.670 "uuid": "25cc196a-8330-4f6e-8ead-dd994d31f54c", 00:14:10.670 "is_configured": true, 00:14:10.670 "data_offset": 0, 00:14:10.670 "data_size": 65536 00:14:10.670 } 00:14:10.670 ] 00:14:10.670 }' 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.670 22:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:11.237 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:11.237 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:11.237 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:11.237 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:11.237 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:11.237 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:11.237 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:11.237 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:11.495 
[2024-07-12 22:21:21.770082] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:11.495 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:11.495 "name": "Existed_Raid", 00:14:11.495 "aliases": [ 00:14:11.495 "9ecb4322-43e3-4cb3-90d3-2ae3dd5f629d" 00:14:11.495 ], 00:14:11.495 "product_name": "Raid Volume", 00:14:11.495 "block_size": 512, 00:14:11.495 "num_blocks": 196608, 00:14:11.495 "uuid": "9ecb4322-43e3-4cb3-90d3-2ae3dd5f629d", 00:14:11.495 "assigned_rate_limits": { 00:14:11.495 "rw_ios_per_sec": 0, 00:14:11.495 "rw_mbytes_per_sec": 0, 00:14:11.495 "r_mbytes_per_sec": 0, 00:14:11.495 "w_mbytes_per_sec": 0 00:14:11.495 }, 00:14:11.495 "claimed": false, 00:14:11.495 "zoned": false, 00:14:11.495 "supported_io_types": { 00:14:11.495 "read": true, 00:14:11.495 "write": true, 00:14:11.495 "unmap": true, 00:14:11.495 "flush": true, 00:14:11.495 "reset": true, 00:14:11.495 "nvme_admin": false, 00:14:11.495 "nvme_io": false, 00:14:11.495 "nvme_io_md": false, 00:14:11.495 "write_zeroes": true, 00:14:11.495 "zcopy": false, 00:14:11.495 "get_zone_info": false, 00:14:11.495 "zone_management": false, 00:14:11.495 "zone_append": false, 00:14:11.495 "compare": false, 00:14:11.495 "compare_and_write": false, 00:14:11.495 "abort": false, 00:14:11.495 "seek_hole": false, 00:14:11.495 "seek_data": false, 00:14:11.495 "copy": false, 00:14:11.495 "nvme_iov_md": false 00:14:11.495 }, 00:14:11.495 "memory_domains": [ 00:14:11.495 { 00:14:11.495 "dma_device_id": "system", 00:14:11.495 "dma_device_type": 1 00:14:11.495 }, 00:14:11.495 { 00:14:11.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.495 "dma_device_type": 2 00:14:11.495 }, 00:14:11.495 { 00:14:11.495 "dma_device_id": "system", 00:14:11.495 "dma_device_type": 1 00:14:11.495 }, 00:14:11.495 { 00:14:11.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.495 "dma_device_type": 2 00:14:11.495 }, 00:14:11.495 { 00:14:11.495 "dma_device_id": "system", 00:14:11.496 "dma_device_type": 1 00:14:11.496 }, 00:14:11.496 { 00:14:11.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.496 "dma_device_type": 2 00:14:11.496 } 00:14:11.496 ], 00:14:11.496 "driver_specific": { 00:14:11.496 "raid": { 00:14:11.496 "uuid": "9ecb4322-43e3-4cb3-90d3-2ae3dd5f629d", 00:14:11.496 "strip_size_kb": 64, 00:14:11.496 "state": "online", 00:14:11.496 "raid_level": "raid0", 00:14:11.496 "superblock": false, 00:14:11.496 "num_base_bdevs": 3, 00:14:11.496 "num_base_bdevs_discovered": 3, 00:14:11.496 "num_base_bdevs_operational": 3, 00:14:11.496 "base_bdevs_list": [ 00:14:11.496 { 00:14:11.496 "name": "BaseBdev1", 00:14:11.496 "uuid": "13b91221-c6d3-4382-abc3-f4821a5d0573", 00:14:11.496 "is_configured": true, 00:14:11.496 "data_offset": 0, 00:14:11.496 "data_size": 65536 00:14:11.496 }, 00:14:11.496 { 00:14:11.496 "name": "BaseBdev2", 00:14:11.496 "uuid": "2fb7c362-1d69-4ace-8a78-7512f9c5481e", 00:14:11.496 "is_configured": true, 00:14:11.496 "data_offset": 0, 00:14:11.496 "data_size": 65536 00:14:11.496 }, 00:14:11.496 { 00:14:11.496 "name": "BaseBdev3", 00:14:11.496 "uuid": "25cc196a-8330-4f6e-8ead-dd994d31f54c", 00:14:11.496 "is_configured": true, 00:14:11.496 "data_offset": 0, 00:14:11.496 "data_size": 65536 00:14:11.496 } 00:14:11.496 ] 00:14:11.496 } 00:14:11.496 } 00:14:11.496 }' 00:14:11.496 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:11.755 22:21:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:11.755 BaseBdev2 00:14:11.755 BaseBdev3' 00:14:11.755 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:11.755 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:11.755 22:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.013 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.013 "name": "BaseBdev1", 00:14:12.013 "aliases": [ 00:14:12.013 "13b91221-c6d3-4382-abc3-f4821a5d0573" 00:14:12.013 ], 00:14:12.013 "product_name": "Malloc disk", 00:14:12.013 "block_size": 512, 00:14:12.013 "num_blocks": 65536, 00:14:12.013 "uuid": "13b91221-c6d3-4382-abc3-f4821a5d0573", 00:14:12.013 "assigned_rate_limits": { 00:14:12.013 "rw_ios_per_sec": 0, 00:14:12.013 "rw_mbytes_per_sec": 0, 00:14:12.013 "r_mbytes_per_sec": 0, 00:14:12.013 "w_mbytes_per_sec": 0 00:14:12.013 }, 00:14:12.013 "claimed": true, 00:14:12.013 "claim_type": "exclusive_write", 00:14:12.013 "zoned": false, 00:14:12.013 "supported_io_types": { 00:14:12.013 "read": true, 00:14:12.013 "write": true, 00:14:12.013 "unmap": true, 00:14:12.013 "flush": true, 00:14:12.013 "reset": true, 00:14:12.013 "nvme_admin": false, 00:14:12.013 "nvme_io": false, 00:14:12.013 "nvme_io_md": false, 00:14:12.013 "write_zeroes": true, 00:14:12.013 "zcopy": true, 00:14:12.013 "get_zone_info": false, 00:14:12.013 "zone_management": false, 00:14:12.013 "zone_append": false, 00:14:12.013 "compare": false, 00:14:12.013 "compare_and_write": false, 00:14:12.013 "abort": true, 00:14:12.013 "seek_hole": false, 00:14:12.013 "seek_data": false, 00:14:12.013 "copy": true, 00:14:12.013 "nvme_iov_md": false 00:14:12.013 }, 00:14:12.013 "memory_domains": [ 00:14:12.013 { 00:14:12.013 "dma_device_id": "system", 00:14:12.013 "dma_device_type": 1 00:14:12.013 }, 00:14:12.013 { 00:14:12.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.013 "dma_device_type": 2 00:14:12.013 } 00:14:12.013 ], 00:14:12.013 "driver_specific": {} 00:14:12.013 }' 00:14:12.013 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.013 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.013 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.013 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.013 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.013 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.013 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.013 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.013 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.013 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.271 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.271 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:12.271 22:21:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.271 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:12.271 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.568 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.568 "name": "BaseBdev2", 00:14:12.568 "aliases": [ 00:14:12.568 "2fb7c362-1d69-4ace-8a78-7512f9c5481e" 00:14:12.568 ], 00:14:12.568 "product_name": "Malloc disk", 00:14:12.568 "block_size": 512, 00:14:12.568 "num_blocks": 65536, 00:14:12.568 "uuid": "2fb7c362-1d69-4ace-8a78-7512f9c5481e", 00:14:12.568 "assigned_rate_limits": { 00:14:12.568 "rw_ios_per_sec": 0, 00:14:12.568 "rw_mbytes_per_sec": 0, 00:14:12.568 "r_mbytes_per_sec": 0, 00:14:12.568 "w_mbytes_per_sec": 0 00:14:12.568 }, 00:14:12.568 "claimed": true, 00:14:12.568 "claim_type": "exclusive_write", 00:14:12.568 "zoned": false, 00:14:12.568 "supported_io_types": { 00:14:12.568 "read": true, 00:14:12.568 "write": true, 00:14:12.568 "unmap": true, 00:14:12.568 "flush": true, 00:14:12.568 "reset": true, 00:14:12.568 "nvme_admin": false, 00:14:12.568 "nvme_io": false, 00:14:12.568 "nvme_io_md": false, 00:14:12.568 "write_zeroes": true, 00:14:12.568 "zcopy": true, 00:14:12.568 "get_zone_info": false, 00:14:12.568 "zone_management": false, 00:14:12.568 "zone_append": false, 00:14:12.568 "compare": false, 00:14:12.568 "compare_and_write": false, 00:14:12.568 "abort": true, 00:14:12.568 "seek_hole": false, 00:14:12.568 "seek_data": false, 00:14:12.568 "copy": true, 00:14:12.568 "nvme_iov_md": false 00:14:12.568 }, 00:14:12.568 "memory_domains": [ 00:14:12.568 { 00:14:12.568 "dma_device_id": "system", 00:14:12.568 "dma_device_type": 1 00:14:12.568 }, 00:14:12.568 { 00:14:12.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.568 "dma_device_type": 2 00:14:12.568 } 00:14:12.568 ], 00:14:12.568 "driver_specific": {} 00:14:12.568 }' 00:14:12.568 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.568 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.568 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.568 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.568 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.568 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.568 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.840 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.840 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.840 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.840 22:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.840 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:12.840 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.840 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:12.840 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:13.098 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:13.098 "name": "BaseBdev3", 00:14:13.098 "aliases": [ 00:14:13.098 "25cc196a-8330-4f6e-8ead-dd994d31f54c" 00:14:13.098 ], 00:14:13.098 "product_name": "Malloc disk", 00:14:13.098 "block_size": 512, 00:14:13.098 "num_blocks": 65536, 00:14:13.098 "uuid": "25cc196a-8330-4f6e-8ead-dd994d31f54c", 00:14:13.098 "assigned_rate_limits": { 00:14:13.098 "rw_ios_per_sec": 0, 00:14:13.098 "rw_mbytes_per_sec": 0, 00:14:13.098 "r_mbytes_per_sec": 0, 00:14:13.098 "w_mbytes_per_sec": 0 00:14:13.098 }, 00:14:13.098 "claimed": true, 00:14:13.098 "claim_type": "exclusive_write", 00:14:13.098 "zoned": false, 00:14:13.098 "supported_io_types": { 00:14:13.098 "read": true, 00:14:13.098 "write": true, 00:14:13.098 "unmap": true, 00:14:13.098 "flush": true, 00:14:13.098 "reset": true, 00:14:13.098 "nvme_admin": false, 00:14:13.098 "nvme_io": false, 00:14:13.098 "nvme_io_md": false, 00:14:13.098 "write_zeroes": true, 00:14:13.098 "zcopy": true, 00:14:13.098 "get_zone_info": false, 00:14:13.098 "zone_management": false, 00:14:13.098 "zone_append": false, 00:14:13.098 "compare": false, 00:14:13.098 "compare_and_write": false, 00:14:13.098 "abort": true, 00:14:13.098 "seek_hole": false, 00:14:13.098 "seek_data": false, 00:14:13.098 "copy": true, 00:14:13.098 "nvme_iov_md": false 00:14:13.098 }, 00:14:13.098 "memory_domains": [ 00:14:13.098 { 00:14:13.098 "dma_device_id": "system", 00:14:13.098 "dma_device_type": 1 00:14:13.098 }, 00:14:13.098 { 00:14:13.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.098 "dma_device_type": 2 00:14:13.098 } 00:14:13.098 ], 00:14:13.098 "driver_specific": {} 00:14:13.098 }' 00:14:13.098 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.098 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.098 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:13.098 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.098 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.355 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:13.355 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.355 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.355 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:13.355 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.355 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.355 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:13.355 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:13.614 [2024-07-12 22:21:23.847330] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:13.614 [2024-07-12 22:21:23.847364] 
bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:13.614 [2024-07-12 22:21:23.847411] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.614 22:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:13.872 22:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:13.872 "name": "Existed_Raid", 00:14:13.873 "uuid": "9ecb4322-43e3-4cb3-90d3-2ae3dd5f629d", 00:14:13.873 "strip_size_kb": 64, 00:14:13.873 "state": "offline", 00:14:13.873 "raid_level": "raid0", 00:14:13.873 "superblock": false, 00:14:13.873 "num_base_bdevs": 3, 00:14:13.873 "num_base_bdevs_discovered": 2, 00:14:13.873 "num_base_bdevs_operational": 2, 00:14:13.873 "base_bdevs_list": [ 00:14:13.873 { 00:14:13.873 "name": null, 00:14:13.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.873 "is_configured": false, 00:14:13.873 "data_offset": 0, 00:14:13.873 "data_size": 65536 00:14:13.873 }, 00:14:13.873 { 00:14:13.873 "name": "BaseBdev2", 00:14:13.873 "uuid": "2fb7c362-1d69-4ace-8a78-7512f9c5481e", 00:14:13.873 "is_configured": true, 00:14:13.873 "data_offset": 0, 00:14:13.873 "data_size": 65536 00:14:13.873 }, 00:14:13.873 { 00:14:13.873 "name": "BaseBdev3", 00:14:13.873 "uuid": "25cc196a-8330-4f6e-8ead-dd994d31f54c", 00:14:13.873 "is_configured": true, 00:14:13.873 "data_offset": 0, 00:14:13.873 "data_size": 65536 00:14:13.873 } 00:14:13.873 ] 00:14:13.873 }' 00:14:13.873 22:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:13.873 22:21:24 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:14.438 22:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:14.438 22:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:14.438 22:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.438 22:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:14.696 22:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:14.696 22:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:14.696 22:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:14.954 [2024-07-12 22:21:25.180821] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:14.954 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:14.954 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:14.954 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.954 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:15.212 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:15.212 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:15.212 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:15.471 [2024-07-12 22:21:25.692946] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:15.471 [2024-07-12 22:21:25.692999] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e7400 name Existed_Raid, state offline 00:14:15.471 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:15.471 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:15.471 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.471 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:15.730 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:15.730 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:15.730 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:15.730 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:15.730 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:15.730 22:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:15.988 BaseBdev2 00:14:15.988 22:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:15.988 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:15.989 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:15.989 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:15.989 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:15.989 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:15.989 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:16.247 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:16.505 [ 00:14:16.505 { 00:14:16.505 "name": "BaseBdev2", 00:14:16.505 "aliases": [ 00:14:16.505 "e12a4169-1275-4155-ae29-0730bc56eafe" 00:14:16.505 ], 00:14:16.505 "product_name": "Malloc disk", 00:14:16.505 "block_size": 512, 00:14:16.505 "num_blocks": 65536, 00:14:16.505 "uuid": "e12a4169-1275-4155-ae29-0730bc56eafe", 00:14:16.505 "assigned_rate_limits": { 00:14:16.505 "rw_ios_per_sec": 0, 00:14:16.505 "rw_mbytes_per_sec": 0, 00:14:16.505 "r_mbytes_per_sec": 0, 00:14:16.505 "w_mbytes_per_sec": 0 00:14:16.505 }, 00:14:16.505 "claimed": false, 00:14:16.505 "zoned": false, 00:14:16.505 "supported_io_types": { 00:14:16.505 "read": true, 00:14:16.505 "write": true, 00:14:16.505 "unmap": true, 00:14:16.505 "flush": true, 00:14:16.505 "reset": true, 00:14:16.505 "nvme_admin": false, 00:14:16.505 "nvme_io": false, 00:14:16.505 "nvme_io_md": false, 00:14:16.505 "write_zeroes": true, 00:14:16.505 "zcopy": true, 00:14:16.505 "get_zone_info": false, 00:14:16.505 "zone_management": false, 00:14:16.505 "zone_append": false, 00:14:16.505 "compare": false, 00:14:16.505 "compare_and_write": false, 00:14:16.505 "abort": true, 00:14:16.505 "seek_hole": false, 00:14:16.505 "seek_data": false, 00:14:16.505 "copy": true, 00:14:16.505 "nvme_iov_md": false 00:14:16.505 }, 00:14:16.505 "memory_domains": [ 00:14:16.505 { 00:14:16.505 "dma_device_id": "system", 00:14:16.505 "dma_device_type": 1 00:14:16.505 }, 00:14:16.505 { 00:14:16.505 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.505 "dma_device_type": 2 00:14:16.505 } 00:14:16.505 ], 00:14:16.505 "driver_specific": {} 00:14:16.505 } 00:14:16.505 ] 00:14:16.505 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:16.505 22:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:16.505 22:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:16.505 22:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:16.764 BaseBdev3 00:14:16.764 22:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:16.764 22:21:26 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:16.764 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:16.764 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:16.764 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:16.764 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:16.764 22:21:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:17.022 22:21:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:17.280 [ 00:14:17.280 { 00:14:17.280 "name": "BaseBdev3", 00:14:17.280 "aliases": [ 00:14:17.280 "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3" 00:14:17.280 ], 00:14:17.280 "product_name": "Malloc disk", 00:14:17.280 "block_size": 512, 00:14:17.280 "num_blocks": 65536, 00:14:17.280 "uuid": "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3", 00:14:17.281 "assigned_rate_limits": { 00:14:17.281 "rw_ios_per_sec": 0, 00:14:17.281 "rw_mbytes_per_sec": 0, 00:14:17.281 "r_mbytes_per_sec": 0, 00:14:17.281 "w_mbytes_per_sec": 0 00:14:17.281 }, 00:14:17.281 "claimed": false, 00:14:17.281 "zoned": false, 00:14:17.281 "supported_io_types": { 00:14:17.281 "read": true, 00:14:17.281 "write": true, 00:14:17.281 "unmap": true, 00:14:17.281 "flush": true, 00:14:17.281 "reset": true, 00:14:17.281 "nvme_admin": false, 00:14:17.281 "nvme_io": false, 00:14:17.281 "nvme_io_md": false, 00:14:17.281 "write_zeroes": true, 00:14:17.281 "zcopy": true, 00:14:17.281 "get_zone_info": false, 00:14:17.281 "zone_management": false, 00:14:17.281 "zone_append": false, 00:14:17.281 "compare": false, 00:14:17.281 "compare_and_write": false, 00:14:17.281 "abort": true, 00:14:17.281 "seek_hole": false, 00:14:17.281 "seek_data": false, 00:14:17.281 "copy": true, 00:14:17.281 "nvme_iov_md": false 00:14:17.281 }, 00:14:17.281 "memory_domains": [ 00:14:17.281 { 00:14:17.281 "dma_device_id": "system", 00:14:17.281 "dma_device_type": 1 00:14:17.281 }, 00:14:17.281 { 00:14:17.281 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.281 "dma_device_type": 2 00:14:17.281 } 00:14:17.281 ], 00:14:17.281 "driver_specific": {} 00:14:17.281 } 00:14:17.281 ] 00:14:17.281 22:21:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:17.281 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:17.281 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:17.281 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:17.540 [2024-07-12 22:21:27.612595] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:17.540 [2024-07-12 22:21:27.612641] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:17.540 [2024-07-12 22:21:27.612663] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:17.540 
[2024-07-12 22:21:27.614045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:17.540 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:17.540 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:17.540 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:17.540 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:17.540 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:17.540 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:17.540 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:17.540 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:17.540 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:17.540 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:17.540 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.540 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:17.798 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:17.798 "name": "Existed_Raid", 00:14:17.798 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:17.798 "strip_size_kb": 64, 00:14:17.798 "state": "configuring", 00:14:17.798 "raid_level": "raid0", 00:14:17.798 "superblock": false, 00:14:17.798 "num_base_bdevs": 3, 00:14:17.798 "num_base_bdevs_discovered": 2, 00:14:17.798 "num_base_bdevs_operational": 3, 00:14:17.798 "base_bdevs_list": [ 00:14:17.798 { 00:14:17.798 "name": "BaseBdev1", 00:14:17.798 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:17.798 "is_configured": false, 00:14:17.798 "data_offset": 0, 00:14:17.798 "data_size": 0 00:14:17.798 }, 00:14:17.798 { 00:14:17.798 "name": "BaseBdev2", 00:14:17.798 "uuid": "e12a4169-1275-4155-ae29-0730bc56eafe", 00:14:17.798 "is_configured": true, 00:14:17.798 "data_offset": 0, 00:14:17.798 "data_size": 65536 00:14:17.798 }, 00:14:17.798 { 00:14:17.798 "name": "BaseBdev3", 00:14:17.799 "uuid": "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3", 00:14:17.799 "is_configured": true, 00:14:17.799 "data_offset": 0, 00:14:17.799 "data_size": 65536 00:14:17.799 } 00:14:17.799 ] 00:14:17.799 }' 00:14:17.799 22:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:17.799 22:21:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.364 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:18.364 [2024-07-12 22:21:28.675391] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:18.622 "name": "Existed_Raid", 00:14:18.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:18.622 "strip_size_kb": 64, 00:14:18.622 "state": "configuring", 00:14:18.622 "raid_level": "raid0", 00:14:18.622 "superblock": false, 00:14:18.622 "num_base_bdevs": 3, 00:14:18.622 "num_base_bdevs_discovered": 1, 00:14:18.622 "num_base_bdevs_operational": 3, 00:14:18.622 "base_bdevs_list": [ 00:14:18.622 { 00:14:18.622 "name": "BaseBdev1", 00:14:18.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:18.622 "is_configured": false, 00:14:18.622 "data_offset": 0, 00:14:18.622 "data_size": 0 00:14:18.622 }, 00:14:18.622 { 00:14:18.622 "name": null, 00:14:18.622 "uuid": "e12a4169-1275-4155-ae29-0730bc56eafe", 00:14:18.622 "is_configured": false, 00:14:18.622 "data_offset": 0, 00:14:18.622 "data_size": 65536 00:14:18.622 }, 00:14:18.622 { 00:14:18.622 "name": "BaseBdev3", 00:14:18.622 "uuid": "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3", 00:14:18.622 "is_configured": true, 00:14:18.622 "data_offset": 0, 00:14:18.622 "data_size": 65536 00:14:18.622 } 00:14:18.622 ] 00:14:18.622 }' 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:18.622 22:21:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.190 22:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.190 22:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:19.757 22:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:19.757 22:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:20.015 [2024-07-12 22:21:30.123140] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:20.015 BaseBdev1 00:14:20.015 22:21:30 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:20.015 22:21:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:20.015 22:21:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:20.015 22:21:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:20.015 22:21:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:20.015 22:21:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:20.015 22:21:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:20.274 22:21:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:20.274 [ 00:14:20.274 { 00:14:20.274 "name": "BaseBdev1", 00:14:20.274 "aliases": [ 00:14:20.274 "65de7bc5-ff87-40f7-99bb-9133a3eb55f9" 00:14:20.274 ], 00:14:20.274 "product_name": "Malloc disk", 00:14:20.274 "block_size": 512, 00:14:20.274 "num_blocks": 65536, 00:14:20.274 "uuid": "65de7bc5-ff87-40f7-99bb-9133a3eb55f9", 00:14:20.274 "assigned_rate_limits": { 00:14:20.274 "rw_ios_per_sec": 0, 00:14:20.274 "rw_mbytes_per_sec": 0, 00:14:20.274 "r_mbytes_per_sec": 0, 00:14:20.274 "w_mbytes_per_sec": 0 00:14:20.274 }, 00:14:20.274 "claimed": true, 00:14:20.274 "claim_type": "exclusive_write", 00:14:20.274 "zoned": false, 00:14:20.274 "supported_io_types": { 00:14:20.274 "read": true, 00:14:20.274 "write": true, 00:14:20.274 "unmap": true, 00:14:20.274 "flush": true, 00:14:20.274 "reset": true, 00:14:20.274 "nvme_admin": false, 00:14:20.274 "nvme_io": false, 00:14:20.274 "nvme_io_md": false, 00:14:20.274 "write_zeroes": true, 00:14:20.274 "zcopy": true, 00:14:20.274 "get_zone_info": false, 00:14:20.274 "zone_management": false, 00:14:20.274 "zone_append": false, 00:14:20.274 "compare": false, 00:14:20.274 "compare_and_write": false, 00:14:20.274 "abort": true, 00:14:20.274 "seek_hole": false, 00:14:20.274 "seek_data": false, 00:14:20.274 "copy": true, 00:14:20.274 "nvme_iov_md": false 00:14:20.274 }, 00:14:20.274 "memory_domains": [ 00:14:20.274 { 00:14:20.274 "dma_device_id": "system", 00:14:20.274 "dma_device_type": 1 00:14:20.274 }, 00:14:20.274 { 00:14:20.274 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.274 "dma_device_type": 2 00:14:20.274 } 00:14:20.274 ], 00:14:20.274 "driver_specific": {} 00:14:20.274 } 00:14:20.274 ] 00:14:20.274 22:21:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:20.274 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:20.274 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:20.274 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:20.274 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:20.274 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:20.274 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:20.274 22:21:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:20.275 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:20.275 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:20.275 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:20.275 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.275 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:20.533 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.533 "name": "Existed_Raid", 00:14:20.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.533 "strip_size_kb": 64, 00:14:20.533 "state": "configuring", 00:14:20.533 "raid_level": "raid0", 00:14:20.533 "superblock": false, 00:14:20.533 "num_base_bdevs": 3, 00:14:20.533 "num_base_bdevs_discovered": 2, 00:14:20.533 "num_base_bdevs_operational": 3, 00:14:20.533 "base_bdevs_list": [ 00:14:20.533 { 00:14:20.533 "name": "BaseBdev1", 00:14:20.533 "uuid": "65de7bc5-ff87-40f7-99bb-9133a3eb55f9", 00:14:20.533 "is_configured": true, 00:14:20.533 "data_offset": 0, 00:14:20.533 "data_size": 65536 00:14:20.533 }, 00:14:20.533 { 00:14:20.533 "name": null, 00:14:20.533 "uuid": "e12a4169-1275-4155-ae29-0730bc56eafe", 00:14:20.533 "is_configured": false, 00:14:20.533 "data_offset": 0, 00:14:20.533 "data_size": 65536 00:14:20.533 }, 00:14:20.533 { 00:14:20.533 "name": "BaseBdev3", 00:14:20.533 "uuid": "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3", 00:14:20.533 "is_configured": true, 00:14:20.533 "data_offset": 0, 00:14:20.533 "data_size": 65536 00:14:20.534 } 00:14:20.534 ] 00:14:20.534 }' 00:14:20.534 22:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.534 22:21:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.470 22:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.470 22:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:21.729 22:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:21.729 22:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:21.729 [2024-07-12 22:21:32.036222] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:21.729 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:21.729 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:21.729 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:21.729 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:21.729 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:21.729 22:21:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:21.729 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:21.729 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:21.729 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:21.729 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:21.729 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.729 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:21.989 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:21.989 "name": "Existed_Raid", 00:14:21.989 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:21.989 "strip_size_kb": 64, 00:14:21.989 "state": "configuring", 00:14:21.989 "raid_level": "raid0", 00:14:21.989 "superblock": false, 00:14:21.989 "num_base_bdevs": 3, 00:14:21.989 "num_base_bdevs_discovered": 1, 00:14:21.989 "num_base_bdevs_operational": 3, 00:14:21.989 "base_bdevs_list": [ 00:14:21.989 { 00:14:21.989 "name": "BaseBdev1", 00:14:21.989 "uuid": "65de7bc5-ff87-40f7-99bb-9133a3eb55f9", 00:14:21.989 "is_configured": true, 00:14:21.989 "data_offset": 0, 00:14:21.989 "data_size": 65536 00:14:21.989 }, 00:14:21.989 { 00:14:21.989 "name": null, 00:14:21.989 "uuid": "e12a4169-1275-4155-ae29-0730bc56eafe", 00:14:21.989 "is_configured": false, 00:14:21.989 "data_offset": 0, 00:14:21.989 "data_size": 65536 00:14:21.989 }, 00:14:21.989 { 00:14:21.989 "name": null, 00:14:21.989 "uuid": "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3", 00:14:21.989 "is_configured": false, 00:14:21.989 "data_offset": 0, 00:14:21.989 "data_size": 65536 00:14:21.989 } 00:14:21.989 ] 00:14:21.989 }' 00:14:21.989 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:21.989 22:21:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:22.555 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.813 22:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:22.813 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:22.813 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:23.071 [2024-07-12 22:21:33.327672] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:23.071 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:23.071 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:23.071 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:23.071 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
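The remove-and-verify cycle traced above reduces to two RPC calls plus a jq check against the same socket. A minimal bash sketch, assuming an SPDK target is still serving /var/tmp/spdk-raid.sock with the malloc base bdevs and Existed_Raid set up as in this run (the rpc helper below is shorthand introduced for the sketch, not part of the test scripts):

    # shorthand for the rpc.py invocation used throughout this log
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

    # drop one base bdev out of the array
    rpc bdev_raid_remove_base_bdev BaseBdev3

    # the raid bdev survives, but stays in "configuring" with the removed slot unconfigured
    rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # configuring
    rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'               # false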
00:14:23.071 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:23.071 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:23.071 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:23.071 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:23.071 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:23.071 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:23.071 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.071 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:23.638 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:23.638 "name": "Existed_Raid", 00:14:23.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:23.638 "strip_size_kb": 64, 00:14:23.638 "state": "configuring", 00:14:23.638 "raid_level": "raid0", 00:14:23.638 "superblock": false, 00:14:23.638 "num_base_bdevs": 3, 00:14:23.638 "num_base_bdevs_discovered": 2, 00:14:23.638 "num_base_bdevs_operational": 3, 00:14:23.638 "base_bdevs_list": [ 00:14:23.638 { 00:14:23.638 "name": "BaseBdev1", 00:14:23.638 "uuid": "65de7bc5-ff87-40f7-99bb-9133a3eb55f9", 00:14:23.638 "is_configured": true, 00:14:23.638 "data_offset": 0, 00:14:23.638 "data_size": 65536 00:14:23.638 }, 00:14:23.638 { 00:14:23.638 "name": null, 00:14:23.638 "uuid": "e12a4169-1275-4155-ae29-0730bc56eafe", 00:14:23.638 "is_configured": false, 00:14:23.638 "data_offset": 0, 00:14:23.638 "data_size": 65536 00:14:23.638 }, 00:14:23.638 { 00:14:23.638 "name": "BaseBdev3", 00:14:23.638 "uuid": "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3", 00:14:23.638 "is_configured": true, 00:14:23.638 "data_offset": 0, 00:14:23.638 "data_size": 65536 00:14:23.638 } 00:14:23.638 ] 00:14:23.638 }' 00:14:23.638 22:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:23.638 22:21:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:24.205 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:24.205 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.466 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:24.466 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:24.725 [2024-07-12 22:21:34.920003] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:24.725 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:24.725 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:24.725 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:24.725 
22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:24.725 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:24.725 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:24.725 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:24.725 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:24.725 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:24.725 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:24.725 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.725 22:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:25.290 22:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.290 "name": "Existed_Raid", 00:14:25.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:25.290 "strip_size_kb": 64, 00:14:25.290 "state": "configuring", 00:14:25.290 "raid_level": "raid0", 00:14:25.290 "superblock": false, 00:14:25.290 "num_base_bdevs": 3, 00:14:25.290 "num_base_bdevs_discovered": 1, 00:14:25.290 "num_base_bdevs_operational": 3, 00:14:25.290 "base_bdevs_list": [ 00:14:25.290 { 00:14:25.290 "name": null, 00:14:25.290 "uuid": "65de7bc5-ff87-40f7-99bb-9133a3eb55f9", 00:14:25.290 "is_configured": false, 00:14:25.290 "data_offset": 0, 00:14:25.290 "data_size": 65536 00:14:25.290 }, 00:14:25.290 { 00:14:25.290 "name": null, 00:14:25.290 "uuid": "e12a4169-1275-4155-ae29-0730bc56eafe", 00:14:25.290 "is_configured": false, 00:14:25.290 "data_offset": 0, 00:14:25.290 "data_size": 65536 00:14:25.290 }, 00:14:25.290 { 00:14:25.290 "name": "BaseBdev3", 00:14:25.290 "uuid": "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3", 00:14:25.290 "is_configured": true, 00:14:25.290 "data_offset": 0, 00:14:25.290 "data_size": 65536 00:14:25.290 } 00:14:25.290 ] 00:14:25.290 }' 00:14:25.290 22:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.290 22:21:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.854 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:25.854 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.111 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:26.111 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:26.398 [2024-07-12 22:21:36.440535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:26.398 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:26.398 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
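The add-back step just traced (bdev_raid_add_base_bdev Existed_Raid BaseBdev2) can be confirmed the same way. Another small sketch under the same assumptions, with expected values taken from the output this run prints below:

    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

    # slot the previously removed base bdev back into the array
    rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev2

    # the discovered count rises and the slot reports is_configured == true
    rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .num_base_bdevs_discovered'   # 2
    rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'                                   # true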
00:14:26.398 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:26.398 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:26.398 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:26.398 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:26.398 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.398 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.399 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.399 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.399 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.399 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:26.399 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:26.399 "name": "Existed_Raid", 00:14:26.399 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:26.399 "strip_size_kb": 64, 00:14:26.399 "state": "configuring", 00:14:26.399 "raid_level": "raid0", 00:14:26.399 "superblock": false, 00:14:26.399 "num_base_bdevs": 3, 00:14:26.399 "num_base_bdevs_discovered": 2, 00:14:26.399 "num_base_bdevs_operational": 3, 00:14:26.399 "base_bdevs_list": [ 00:14:26.399 { 00:14:26.399 "name": null, 00:14:26.399 "uuid": "65de7bc5-ff87-40f7-99bb-9133a3eb55f9", 00:14:26.399 "is_configured": false, 00:14:26.399 "data_offset": 0, 00:14:26.399 "data_size": 65536 00:14:26.399 }, 00:14:26.399 { 00:14:26.399 "name": "BaseBdev2", 00:14:26.399 "uuid": "e12a4169-1275-4155-ae29-0730bc56eafe", 00:14:26.399 "is_configured": true, 00:14:26.399 "data_offset": 0, 00:14:26.399 "data_size": 65536 00:14:26.399 }, 00:14:26.399 { 00:14:26.399 "name": "BaseBdev3", 00:14:26.399 "uuid": "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3", 00:14:26.399 "is_configured": true, 00:14:26.399 "data_offset": 0, 00:14:26.399 "data_size": 65536 00:14:26.399 } 00:14:26.399 ] 00:14:26.399 }' 00:14:26.399 22:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:26.399 22:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:27.332 22:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.332 22:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:27.332 22:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:27.332 22:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.332 22:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:27.590 22:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 65de7bc5-ff87-40f7-99bb-9133a3eb55f9 00:14:27.848 [2024-07-12 22:21:38.017324] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:27.848 [2024-07-12 22:21:38.017370] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18e5450 00:14:27.848 [2024-07-12 22:21:38.017379] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:27.848 [2024-07-12 22:21:38.017578] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e6a50 00:14:27.848 [2024-07-12 22:21:38.017695] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18e5450 00:14:27.848 [2024-07-12 22:21:38.017705] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x18e5450 00:14:27.848 [2024-07-12 22:21:38.017883] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:27.848 NewBaseBdev 00:14:27.848 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:27.848 22:21:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:27.848 22:21:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:27.848 22:21:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:27.848 22:21:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:27.848 22:21:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:27.848 22:21:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:28.106 22:21:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:28.365 [ 00:14:28.365 { 00:14:28.365 "name": "NewBaseBdev", 00:14:28.365 "aliases": [ 00:14:28.365 "65de7bc5-ff87-40f7-99bb-9133a3eb55f9" 00:14:28.365 ], 00:14:28.365 "product_name": "Malloc disk", 00:14:28.365 "block_size": 512, 00:14:28.365 "num_blocks": 65536, 00:14:28.365 "uuid": "65de7bc5-ff87-40f7-99bb-9133a3eb55f9", 00:14:28.365 "assigned_rate_limits": { 00:14:28.365 "rw_ios_per_sec": 0, 00:14:28.365 "rw_mbytes_per_sec": 0, 00:14:28.365 "r_mbytes_per_sec": 0, 00:14:28.365 "w_mbytes_per_sec": 0 00:14:28.365 }, 00:14:28.365 "claimed": true, 00:14:28.365 "claim_type": "exclusive_write", 00:14:28.365 "zoned": false, 00:14:28.365 "supported_io_types": { 00:14:28.365 "read": true, 00:14:28.365 "write": true, 00:14:28.365 "unmap": true, 00:14:28.365 "flush": true, 00:14:28.365 "reset": true, 00:14:28.365 "nvme_admin": false, 00:14:28.365 "nvme_io": false, 00:14:28.365 "nvme_io_md": false, 00:14:28.365 "write_zeroes": true, 00:14:28.365 "zcopy": true, 00:14:28.365 "get_zone_info": false, 00:14:28.365 "zone_management": false, 00:14:28.365 "zone_append": false, 00:14:28.365 "compare": false, 00:14:28.365 "compare_and_write": false, 00:14:28.365 "abort": true, 00:14:28.365 "seek_hole": false, 00:14:28.365 "seek_data": false, 00:14:28.365 "copy": true, 00:14:28.365 "nvme_iov_md": false 00:14:28.365 }, 00:14:28.365 "memory_domains": [ 00:14:28.365 { 00:14:28.365 "dma_device_id": "system", 00:14:28.365 
"dma_device_type": 1 00:14:28.365 }, 00:14:28.365 { 00:14:28.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.365 "dma_device_type": 2 00:14:28.365 } 00:14:28.365 ], 00:14:28.365 "driver_specific": {} 00:14:28.365 } 00:14:28.365 ] 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.365 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:28.624 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.624 "name": "Existed_Raid", 00:14:28.624 "uuid": "8e2c7c1e-2fa8-47cd-a007-7ab1099553bc", 00:14:28.624 "strip_size_kb": 64, 00:14:28.624 "state": "online", 00:14:28.624 "raid_level": "raid0", 00:14:28.624 "superblock": false, 00:14:28.624 "num_base_bdevs": 3, 00:14:28.624 "num_base_bdevs_discovered": 3, 00:14:28.624 "num_base_bdevs_operational": 3, 00:14:28.624 "base_bdevs_list": [ 00:14:28.624 { 00:14:28.624 "name": "NewBaseBdev", 00:14:28.624 "uuid": "65de7bc5-ff87-40f7-99bb-9133a3eb55f9", 00:14:28.624 "is_configured": true, 00:14:28.624 "data_offset": 0, 00:14:28.624 "data_size": 65536 00:14:28.624 }, 00:14:28.624 { 00:14:28.624 "name": "BaseBdev2", 00:14:28.624 "uuid": "e12a4169-1275-4155-ae29-0730bc56eafe", 00:14:28.624 "is_configured": true, 00:14:28.624 "data_offset": 0, 00:14:28.624 "data_size": 65536 00:14:28.624 }, 00:14:28.624 { 00:14:28.624 "name": "BaseBdev3", 00:14:28.624 "uuid": "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3", 00:14:28.624 "is_configured": true, 00:14:28.624 "data_offset": 0, 00:14:28.624 "data_size": 65536 00:14:28.624 } 00:14:28.624 ] 00:14:28.624 }' 00:14:28.624 22:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.624 22:21:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.191 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:29.191 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:29.191 22:21:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:29.191 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:29.191 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:29.191 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:29.191 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:29.191 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:29.449 [2024-07-12 22:21:39.577763] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:29.449 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:29.449 "name": "Existed_Raid", 00:14:29.449 "aliases": [ 00:14:29.449 "8e2c7c1e-2fa8-47cd-a007-7ab1099553bc" 00:14:29.449 ], 00:14:29.449 "product_name": "Raid Volume", 00:14:29.449 "block_size": 512, 00:14:29.449 "num_blocks": 196608, 00:14:29.449 "uuid": "8e2c7c1e-2fa8-47cd-a007-7ab1099553bc", 00:14:29.449 "assigned_rate_limits": { 00:14:29.449 "rw_ios_per_sec": 0, 00:14:29.449 "rw_mbytes_per_sec": 0, 00:14:29.449 "r_mbytes_per_sec": 0, 00:14:29.449 "w_mbytes_per_sec": 0 00:14:29.449 }, 00:14:29.449 "claimed": false, 00:14:29.449 "zoned": false, 00:14:29.449 "supported_io_types": { 00:14:29.449 "read": true, 00:14:29.449 "write": true, 00:14:29.449 "unmap": true, 00:14:29.449 "flush": true, 00:14:29.449 "reset": true, 00:14:29.449 "nvme_admin": false, 00:14:29.449 "nvme_io": false, 00:14:29.449 "nvme_io_md": false, 00:14:29.449 "write_zeroes": true, 00:14:29.449 "zcopy": false, 00:14:29.449 "get_zone_info": false, 00:14:29.449 "zone_management": false, 00:14:29.449 "zone_append": false, 00:14:29.449 "compare": false, 00:14:29.449 "compare_and_write": false, 00:14:29.449 "abort": false, 00:14:29.449 "seek_hole": false, 00:14:29.449 "seek_data": false, 00:14:29.449 "copy": false, 00:14:29.449 "nvme_iov_md": false 00:14:29.449 }, 00:14:29.449 "memory_domains": [ 00:14:29.449 { 00:14:29.449 "dma_device_id": "system", 00:14:29.449 "dma_device_type": 1 00:14:29.449 }, 00:14:29.449 { 00:14:29.449 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:29.449 "dma_device_type": 2 00:14:29.449 }, 00:14:29.449 { 00:14:29.449 "dma_device_id": "system", 00:14:29.449 "dma_device_type": 1 00:14:29.449 }, 00:14:29.449 { 00:14:29.449 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:29.449 "dma_device_type": 2 00:14:29.449 }, 00:14:29.449 { 00:14:29.449 "dma_device_id": "system", 00:14:29.449 "dma_device_type": 1 00:14:29.449 }, 00:14:29.449 { 00:14:29.449 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:29.449 "dma_device_type": 2 00:14:29.449 } 00:14:29.449 ], 00:14:29.449 "driver_specific": { 00:14:29.449 "raid": { 00:14:29.449 "uuid": "8e2c7c1e-2fa8-47cd-a007-7ab1099553bc", 00:14:29.449 "strip_size_kb": 64, 00:14:29.449 "state": "online", 00:14:29.449 "raid_level": "raid0", 00:14:29.449 "superblock": false, 00:14:29.449 "num_base_bdevs": 3, 00:14:29.449 "num_base_bdevs_discovered": 3, 00:14:29.449 "num_base_bdevs_operational": 3, 00:14:29.449 "base_bdevs_list": [ 00:14:29.449 { 00:14:29.449 "name": "NewBaseBdev", 00:14:29.449 "uuid": "65de7bc5-ff87-40f7-99bb-9133a3eb55f9", 00:14:29.449 "is_configured": true, 00:14:29.449 "data_offset": 0, 00:14:29.449 "data_size": 65536 00:14:29.449 }, 00:14:29.449 { 00:14:29.449 "name": 
"BaseBdev2", 00:14:29.449 "uuid": "e12a4169-1275-4155-ae29-0730bc56eafe", 00:14:29.449 "is_configured": true, 00:14:29.449 "data_offset": 0, 00:14:29.449 "data_size": 65536 00:14:29.449 }, 00:14:29.449 { 00:14:29.449 "name": "BaseBdev3", 00:14:29.450 "uuid": "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3", 00:14:29.450 "is_configured": true, 00:14:29.450 "data_offset": 0, 00:14:29.450 "data_size": 65536 00:14:29.450 } 00:14:29.450 ] 00:14:29.450 } 00:14:29.450 } 00:14:29.450 }' 00:14:29.450 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:29.450 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:29.450 BaseBdev2 00:14:29.450 BaseBdev3' 00:14:29.450 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:29.450 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:29.450 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:29.708 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:29.708 "name": "NewBaseBdev", 00:14:29.708 "aliases": [ 00:14:29.708 "65de7bc5-ff87-40f7-99bb-9133a3eb55f9" 00:14:29.708 ], 00:14:29.708 "product_name": "Malloc disk", 00:14:29.708 "block_size": 512, 00:14:29.708 "num_blocks": 65536, 00:14:29.708 "uuid": "65de7bc5-ff87-40f7-99bb-9133a3eb55f9", 00:14:29.708 "assigned_rate_limits": { 00:14:29.708 "rw_ios_per_sec": 0, 00:14:29.708 "rw_mbytes_per_sec": 0, 00:14:29.708 "r_mbytes_per_sec": 0, 00:14:29.708 "w_mbytes_per_sec": 0 00:14:29.708 }, 00:14:29.708 "claimed": true, 00:14:29.708 "claim_type": "exclusive_write", 00:14:29.708 "zoned": false, 00:14:29.708 "supported_io_types": { 00:14:29.708 "read": true, 00:14:29.708 "write": true, 00:14:29.708 "unmap": true, 00:14:29.708 "flush": true, 00:14:29.708 "reset": true, 00:14:29.708 "nvme_admin": false, 00:14:29.708 "nvme_io": false, 00:14:29.708 "nvme_io_md": false, 00:14:29.708 "write_zeroes": true, 00:14:29.708 "zcopy": true, 00:14:29.708 "get_zone_info": false, 00:14:29.708 "zone_management": false, 00:14:29.708 "zone_append": false, 00:14:29.708 "compare": false, 00:14:29.708 "compare_and_write": false, 00:14:29.708 "abort": true, 00:14:29.708 "seek_hole": false, 00:14:29.708 "seek_data": false, 00:14:29.708 "copy": true, 00:14:29.708 "nvme_iov_md": false 00:14:29.708 }, 00:14:29.708 "memory_domains": [ 00:14:29.708 { 00:14:29.708 "dma_device_id": "system", 00:14:29.708 "dma_device_type": 1 00:14:29.708 }, 00:14:29.708 { 00:14:29.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:29.708 "dma_device_type": 2 00:14:29.708 } 00:14:29.708 ], 00:14:29.708 "driver_specific": {} 00:14:29.708 }' 00:14:29.708 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:29.708 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:29.708 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:29.708 22:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:29.708 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:29.966 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null 
== null ]] 00:14:29.966 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:29.966 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:29.966 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:29.966 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:29.966 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:29.966 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:29.966 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:29.966 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:29.966 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:30.224 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:30.224 "name": "BaseBdev2", 00:14:30.224 "aliases": [ 00:14:30.224 "e12a4169-1275-4155-ae29-0730bc56eafe" 00:14:30.224 ], 00:14:30.224 "product_name": "Malloc disk", 00:14:30.224 "block_size": 512, 00:14:30.224 "num_blocks": 65536, 00:14:30.224 "uuid": "e12a4169-1275-4155-ae29-0730bc56eafe", 00:14:30.224 "assigned_rate_limits": { 00:14:30.224 "rw_ios_per_sec": 0, 00:14:30.224 "rw_mbytes_per_sec": 0, 00:14:30.224 "r_mbytes_per_sec": 0, 00:14:30.224 "w_mbytes_per_sec": 0 00:14:30.224 }, 00:14:30.224 "claimed": true, 00:14:30.224 "claim_type": "exclusive_write", 00:14:30.224 "zoned": false, 00:14:30.224 "supported_io_types": { 00:14:30.224 "read": true, 00:14:30.224 "write": true, 00:14:30.224 "unmap": true, 00:14:30.224 "flush": true, 00:14:30.224 "reset": true, 00:14:30.224 "nvme_admin": false, 00:14:30.224 "nvme_io": false, 00:14:30.224 "nvme_io_md": false, 00:14:30.224 "write_zeroes": true, 00:14:30.224 "zcopy": true, 00:14:30.224 "get_zone_info": false, 00:14:30.224 "zone_management": false, 00:14:30.224 "zone_append": false, 00:14:30.224 "compare": false, 00:14:30.224 "compare_and_write": false, 00:14:30.224 "abort": true, 00:14:30.224 "seek_hole": false, 00:14:30.224 "seek_data": false, 00:14:30.224 "copy": true, 00:14:30.224 "nvme_iov_md": false 00:14:30.224 }, 00:14:30.224 "memory_domains": [ 00:14:30.224 { 00:14:30.224 "dma_device_id": "system", 00:14:30.224 "dma_device_type": 1 00:14:30.224 }, 00:14:30.224 { 00:14:30.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:30.224 "dma_device_type": 2 00:14:30.224 } 00:14:30.224 ], 00:14:30.224 "driver_specific": {} 00:14:30.224 }' 00:14:30.224 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:30.224 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:30.224 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:30.224 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:30.482 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:30.482 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:30.482 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:30.482 22:21:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:30.482 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:30.482 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:30.482 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:30.482 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:30.482 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:30.482 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:30.482 22:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:30.741 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:30.741 "name": "BaseBdev3", 00:14:30.741 "aliases": [ 00:14:30.741 "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3" 00:14:30.741 ], 00:14:30.741 "product_name": "Malloc disk", 00:14:30.741 "block_size": 512, 00:14:30.741 "num_blocks": 65536, 00:14:30.741 "uuid": "9b16c65d-4f00-4f1e-ba7c-b1b7436b5ea3", 00:14:30.741 "assigned_rate_limits": { 00:14:30.741 "rw_ios_per_sec": 0, 00:14:30.741 "rw_mbytes_per_sec": 0, 00:14:30.741 "r_mbytes_per_sec": 0, 00:14:30.741 "w_mbytes_per_sec": 0 00:14:30.741 }, 00:14:30.741 "claimed": true, 00:14:30.741 "claim_type": "exclusive_write", 00:14:30.741 "zoned": false, 00:14:30.741 "supported_io_types": { 00:14:30.741 "read": true, 00:14:30.741 "write": true, 00:14:30.741 "unmap": true, 00:14:30.741 "flush": true, 00:14:30.741 "reset": true, 00:14:30.741 "nvme_admin": false, 00:14:30.741 "nvme_io": false, 00:14:30.741 "nvme_io_md": false, 00:14:30.741 "write_zeroes": true, 00:14:30.741 "zcopy": true, 00:14:30.741 "get_zone_info": false, 00:14:30.741 "zone_management": false, 00:14:30.741 "zone_append": false, 00:14:30.741 "compare": false, 00:14:30.741 "compare_and_write": false, 00:14:30.741 "abort": true, 00:14:30.741 "seek_hole": false, 00:14:30.741 "seek_data": false, 00:14:30.741 "copy": true, 00:14:30.741 "nvme_iov_md": false 00:14:30.741 }, 00:14:30.741 "memory_domains": [ 00:14:30.741 { 00:14:30.741 "dma_device_id": "system", 00:14:30.741 "dma_device_type": 1 00:14:30.741 }, 00:14:30.741 { 00:14:30.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:30.741 "dma_device_type": 2 00:14:30.741 } 00:14:30.741 ], 00:14:30.741 "driver_specific": {} 00:14:30.741 }' 00:14:30.741 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:30.741 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:30.999 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:30.999 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:30.999 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:30.999 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:30.999 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:30.999 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:30.999 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:30.999 22:21:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:30.999 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:31.257 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:31.257 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:31.257 [2024-07-12 22:21:41.570796] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:31.257 [2024-07-12 22:21:41.570828] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:31.257 [2024-07-12 22:21:41.570886] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:31.257 [2024-07-12 22:21:41.570947] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:31.257 [2024-07-12 22:21:41.570960] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e5450 name Existed_Raid, state offline 00:14:31.515 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3441936 00:14:31.515 22:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3441936 ']' 00:14:31.515 22:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3441936 00:14:31.515 22:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:31.515 22:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:31.515 22:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3441936 00:14:31.515 22:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:31.515 22:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:31.515 22:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3441936' 00:14:31.515 killing process with pid 3441936 00:14:31.515 22:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3441936 00:14:31.515 [2024-07-12 22:21:41.637223] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:31.515 22:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3441936 00:14:31.515 [2024-07-12 22:21:41.668478] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:31.773 22:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:31.773 00:14:31.773 real 0m28.513s 00:14:31.773 user 0m52.406s 00:14:31.773 sys 0m4.971s 00:14:31.773 22:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.774 ************************************ 00:14:31.774 END TEST raid_state_function_test 00:14:31.774 ************************************ 00:14:31.774 22:21:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:31.774 22:21:41 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:14:31.774 22:21:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:31.774 
22:21:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:31.774 22:21:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:31.774 ************************************ 00:14:31.774 START TEST raid_state_function_test_sb 00:14:31.774 ************************************ 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3446248 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process 
raid pid: 3446248' 00:14:31.774 Process raid pid: 3446248 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3446248 /var/tmp/spdk-raid.sock 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3446248 ']' 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:31.774 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:31.774 22:21:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:31.774 [2024-07-12 22:21:42.057501] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:14:31.774 [2024-07-12 22:21:42.057574] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:32.032 [2024-07-12 22:21:42.188146] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:32.032 [2024-07-12 22:21:42.285352] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:32.032 [2024-07-12 22:21:42.353498] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:32.032 [2024-07-12 22:21:42.353537] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:32.600 22:21:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:32.600 22:21:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:14:32.600 22:21:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:32.869 [2024-07-12 22:21:43.136955] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:32.869 [2024-07-12 22:21:43.137006] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:32.869 [2024-07-12 22:21:43.137018] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:32.869 [2024-07-12 22:21:43.137030] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:32.869 [2024-07-12 22:21:43.137039] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:32.869 [2024-07-12 22:21:43.137051] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:32.869 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 
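The only functional difference between raid_state_function_test_sb and the raid_state_function_test run above is the superblock flag: superblock_create_arg resolves to -s, so the bdev_raid_create calls in this half of the log carry it. Both forms, copied from this log and shown side by side for comparison (not meant to be issued back to back against one target, since the raid name would collide):

    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

    # raid_state_function_test: no on-disk superblock
    rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid

    # raid_state_function_test_sb: same array, with a persisted superblock (-s)
    rpc bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid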
00:14:32.869 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:32.869 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:32.869 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:32.869 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:32.869 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:32.869 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:32.869 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:32.869 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:32.869 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:32.869 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.869 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:33.143 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:33.143 "name": "Existed_Raid", 00:14:33.143 "uuid": "6795aba0-030a-4e5d-a3e5-ce088084a886", 00:14:33.143 "strip_size_kb": 64, 00:14:33.143 "state": "configuring", 00:14:33.143 "raid_level": "raid0", 00:14:33.143 "superblock": true, 00:14:33.143 "num_base_bdevs": 3, 00:14:33.143 "num_base_bdevs_discovered": 0, 00:14:33.143 "num_base_bdevs_operational": 3, 00:14:33.143 "base_bdevs_list": [ 00:14:33.143 { 00:14:33.143 "name": "BaseBdev1", 00:14:33.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.143 "is_configured": false, 00:14:33.143 "data_offset": 0, 00:14:33.143 "data_size": 0 00:14:33.143 }, 00:14:33.143 { 00:14:33.143 "name": "BaseBdev2", 00:14:33.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.143 "is_configured": false, 00:14:33.143 "data_offset": 0, 00:14:33.143 "data_size": 0 00:14:33.143 }, 00:14:33.143 { 00:14:33.143 "name": "BaseBdev3", 00:14:33.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.143 "is_configured": false, 00:14:33.143 "data_offset": 0, 00:14:33.143 "data_size": 0 00:14:33.143 } 00:14:33.143 ] 00:14:33.143 }' 00:14:33.143 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:33.143 22:21:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:33.709 22:21:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:33.967 [2024-07-12 22:21:44.183537] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:33.967 [2024-07-12 22:21:44.183573] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dfea80 name Existed_Raid, state configuring 00:14:33.968 22:21:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:34.225 
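At this point none of the base bdevs exist yet, so the freshly created array sits in "configuring" with zero members discovered. The verify_raid_bdev_state checks traced above amount to pulling the Existed_Raid entry with the same jq filter the script uses and comparing a few fields. A rough equivalent, again assuming the target from this run is still reachable, with expected values taken from what this run prints:

    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

    info=$(rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    echo "$info" | jq -r '.state'                       # configuring
    echo "$info" | jq -r '.raid_level'                  # raid0
    echo "$info" | jq -r '.strip_size_kb'               # 64
    echo "$info" | jq -r '.superblock'                  # true, because the array was created with -s
    echo "$info" | jq -r '.num_base_bdevs_discovered'   # 0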
[2024-07-12 22:21:44.424212] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:34.225 [2024-07-12 22:21:44.424247] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:34.225 [2024-07-12 22:21:44.424257] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:34.226 [2024-07-12 22:21:44.424269] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:34.226 [2024-07-12 22:21:44.424277] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:34.226 [2024-07-12 22:21:44.424288] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:34.226 22:21:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:34.484 [2024-07-12 22:21:44.678753] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:34.484 BaseBdev1 00:14:34.484 22:21:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:34.484 22:21:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:34.484 22:21:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:34.484 22:21:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:34.484 22:21:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:34.484 22:21:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:34.484 22:21:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:34.743 22:21:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:35.002 [ 00:14:35.002 { 00:14:35.002 "name": "BaseBdev1", 00:14:35.002 "aliases": [ 00:14:35.002 "22bcef33-eff3-4a52-9d13-eaf66f629331" 00:14:35.002 ], 00:14:35.002 "product_name": "Malloc disk", 00:14:35.002 "block_size": 512, 00:14:35.002 "num_blocks": 65536, 00:14:35.002 "uuid": "22bcef33-eff3-4a52-9d13-eaf66f629331", 00:14:35.002 "assigned_rate_limits": { 00:14:35.002 "rw_ios_per_sec": 0, 00:14:35.002 "rw_mbytes_per_sec": 0, 00:14:35.002 "r_mbytes_per_sec": 0, 00:14:35.002 "w_mbytes_per_sec": 0 00:14:35.002 }, 00:14:35.002 "claimed": true, 00:14:35.002 "claim_type": "exclusive_write", 00:14:35.002 "zoned": false, 00:14:35.002 "supported_io_types": { 00:14:35.002 "read": true, 00:14:35.002 "write": true, 00:14:35.002 "unmap": true, 00:14:35.002 "flush": true, 00:14:35.002 "reset": true, 00:14:35.002 "nvme_admin": false, 00:14:35.002 "nvme_io": false, 00:14:35.002 "nvme_io_md": false, 00:14:35.002 "write_zeroes": true, 00:14:35.002 "zcopy": true, 00:14:35.002 "get_zone_info": false, 00:14:35.002 "zone_management": false, 00:14:35.002 "zone_append": false, 00:14:35.002 "compare": false, 00:14:35.002 "compare_and_write": false, 00:14:35.002 "abort": true, 00:14:35.002 "seek_hole": false, 00:14:35.002 "seek_data": false, 00:14:35.002 "copy": true, 00:14:35.002 
"nvme_iov_md": false 00:14:35.002 }, 00:14:35.002 "memory_domains": [ 00:14:35.002 { 00:14:35.002 "dma_device_id": "system", 00:14:35.002 "dma_device_type": 1 00:14:35.002 }, 00:14:35.002 { 00:14:35.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.002 "dma_device_type": 2 00:14:35.002 } 00:14:35.002 ], 00:14:35.002 "driver_specific": {} 00:14:35.002 } 00:14:35.002 ] 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.002 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:35.262 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:35.262 "name": "Existed_Raid", 00:14:35.262 "uuid": "882fe19f-461d-42d8-985c-4ab828461209", 00:14:35.262 "strip_size_kb": 64, 00:14:35.262 "state": "configuring", 00:14:35.262 "raid_level": "raid0", 00:14:35.262 "superblock": true, 00:14:35.262 "num_base_bdevs": 3, 00:14:35.262 "num_base_bdevs_discovered": 1, 00:14:35.262 "num_base_bdevs_operational": 3, 00:14:35.262 "base_bdevs_list": [ 00:14:35.262 { 00:14:35.262 "name": "BaseBdev1", 00:14:35.262 "uuid": "22bcef33-eff3-4a52-9d13-eaf66f629331", 00:14:35.262 "is_configured": true, 00:14:35.262 "data_offset": 2048, 00:14:35.262 "data_size": 63488 00:14:35.262 }, 00:14:35.262 { 00:14:35.262 "name": "BaseBdev2", 00:14:35.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.262 "is_configured": false, 00:14:35.262 "data_offset": 0, 00:14:35.262 "data_size": 0 00:14:35.262 }, 00:14:35.262 { 00:14:35.262 "name": "BaseBdev3", 00:14:35.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.262 "is_configured": false, 00:14:35.262 "data_offset": 0, 00:14:35.262 "data_size": 0 00:14:35.262 } 00:14:35.262 ] 00:14:35.262 }' 00:14:35.262 22:21:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:35.262 22:21:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:35.830 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:36.088 [2024-07-12 22:21:46.254941] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:36.088 [2024-07-12 22:21:46.254984] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dfe310 name Existed_Raid, state configuring 00:14:36.088 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:36.346 [2024-07-12 22:21:46.503635] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:36.347 [2024-07-12 22:21:46.505088] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:36.347 [2024-07-12 22:21:46.505121] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:36.347 [2024-07-12 22:21:46.505138] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:36.347 [2024-07-12 22:21:46.505149] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.347 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:36.606 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.606 "name": "Existed_Raid", 00:14:36.606 "uuid": "87cfe4d8-3448-423a-9d18-cdaa095436e0", 00:14:36.606 "strip_size_kb": 64, 00:14:36.606 "state": "configuring", 00:14:36.606 "raid_level": "raid0", 00:14:36.606 "superblock": true, 00:14:36.606 "num_base_bdevs": 3, 00:14:36.606 "num_base_bdevs_discovered": 1, 00:14:36.606 "num_base_bdevs_operational": 3, 00:14:36.606 "base_bdevs_list": [ 
00:14:36.606 { 00:14:36.606 "name": "BaseBdev1", 00:14:36.606 "uuid": "22bcef33-eff3-4a52-9d13-eaf66f629331", 00:14:36.606 "is_configured": true, 00:14:36.606 "data_offset": 2048, 00:14:36.606 "data_size": 63488 00:14:36.606 }, 00:14:36.606 { 00:14:36.606 "name": "BaseBdev2", 00:14:36.606 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.606 "is_configured": false, 00:14:36.606 "data_offset": 0, 00:14:36.606 "data_size": 0 00:14:36.606 }, 00:14:36.606 { 00:14:36.606 "name": "BaseBdev3", 00:14:36.606 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.606 "is_configured": false, 00:14:36.606 "data_offset": 0, 00:14:36.606 "data_size": 0 00:14:36.606 } 00:14:36.606 ] 00:14:36.606 }' 00:14:36.606 22:21:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.606 22:21:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:37.173 22:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:37.432 [2024-07-12 22:21:47.590010] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:37.432 BaseBdev2 00:14:37.432 22:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:37.432 22:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:37.432 22:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:37.432 22:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:37.432 22:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:37.432 22:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:37.432 22:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:37.691 22:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:37.951 [ 00:14:37.951 { 00:14:37.951 "name": "BaseBdev2", 00:14:37.951 "aliases": [ 00:14:37.951 "2e7f9e44-a0df-46e3-afee-4e7ecea0b292" 00:14:37.951 ], 00:14:37.951 "product_name": "Malloc disk", 00:14:37.951 "block_size": 512, 00:14:37.951 "num_blocks": 65536, 00:14:37.951 "uuid": "2e7f9e44-a0df-46e3-afee-4e7ecea0b292", 00:14:37.951 "assigned_rate_limits": { 00:14:37.951 "rw_ios_per_sec": 0, 00:14:37.951 "rw_mbytes_per_sec": 0, 00:14:37.951 "r_mbytes_per_sec": 0, 00:14:37.951 "w_mbytes_per_sec": 0 00:14:37.951 }, 00:14:37.951 "claimed": true, 00:14:37.951 "claim_type": "exclusive_write", 00:14:37.951 "zoned": false, 00:14:37.951 "supported_io_types": { 00:14:37.951 "read": true, 00:14:37.951 "write": true, 00:14:37.951 "unmap": true, 00:14:37.951 "flush": true, 00:14:37.951 "reset": true, 00:14:37.951 "nvme_admin": false, 00:14:37.951 "nvme_io": false, 00:14:37.951 "nvme_io_md": false, 00:14:37.951 "write_zeroes": true, 00:14:37.951 "zcopy": true, 00:14:37.951 "get_zone_info": false, 00:14:37.951 "zone_management": false, 00:14:37.951 "zone_append": false, 00:14:37.951 "compare": false, 00:14:37.951 "compare_and_write": false, 
00:14:37.951 "abort": true, 00:14:37.951 "seek_hole": false, 00:14:37.951 "seek_data": false, 00:14:37.951 "copy": true, 00:14:37.951 "nvme_iov_md": false 00:14:37.951 }, 00:14:37.951 "memory_domains": [ 00:14:37.951 { 00:14:37.951 "dma_device_id": "system", 00:14:37.951 "dma_device_type": 1 00:14:37.951 }, 00:14:37.951 { 00:14:37.951 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.951 "dma_device_type": 2 00:14:37.951 } 00:14:37.951 ], 00:14:37.951 "driver_specific": {} 00:14:37.951 } 00:14:37.951 ] 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.951 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:38.210 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:38.210 "name": "Existed_Raid", 00:14:38.210 "uuid": "87cfe4d8-3448-423a-9d18-cdaa095436e0", 00:14:38.210 "strip_size_kb": 64, 00:14:38.210 "state": "configuring", 00:14:38.210 "raid_level": "raid0", 00:14:38.210 "superblock": true, 00:14:38.210 "num_base_bdevs": 3, 00:14:38.210 "num_base_bdevs_discovered": 2, 00:14:38.210 "num_base_bdevs_operational": 3, 00:14:38.210 "base_bdevs_list": [ 00:14:38.210 { 00:14:38.210 "name": "BaseBdev1", 00:14:38.210 "uuid": "22bcef33-eff3-4a52-9d13-eaf66f629331", 00:14:38.210 "is_configured": true, 00:14:38.210 "data_offset": 2048, 00:14:38.210 "data_size": 63488 00:14:38.210 }, 00:14:38.210 { 00:14:38.210 "name": "BaseBdev2", 00:14:38.210 "uuid": "2e7f9e44-a0df-46e3-afee-4e7ecea0b292", 00:14:38.210 "is_configured": true, 00:14:38.210 "data_offset": 2048, 00:14:38.210 "data_size": 63488 00:14:38.210 }, 00:14:38.210 { 00:14:38.210 "name": "BaseBdev3", 00:14:38.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.210 "is_configured": false, 00:14:38.210 "data_offset": 0, 00:14:38.210 "data_size": 0 00:14:38.210 } 00:14:38.210 
] 00:14:38.210 }' 00:14:38.210 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:38.210 22:21:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:38.778 22:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:39.036 [2024-07-12 22:21:49.177618] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:39.036 [2024-07-12 22:21:49.177787] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1dff400 00:14:39.036 [2024-07-12 22:21:49.177802] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:39.036 [2024-07-12 22:21:49.177982] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dfeef0 00:14:39.036 [2024-07-12 22:21:49.178099] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1dff400 00:14:39.036 [2024-07-12 22:21:49.178109] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1dff400 00:14:39.036 [2024-07-12 22:21:49.178200] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:39.036 BaseBdev3 00:14:39.036 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:39.036 22:21:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:39.036 22:21:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:39.036 22:21:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:39.036 22:21:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:39.036 22:21:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:39.036 22:21:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.295 22:21:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:39.554 [ 00:14:39.554 { 00:14:39.554 "name": "BaseBdev3", 00:14:39.554 "aliases": [ 00:14:39.554 "240af21c-8669-42df-ba69-b77ac9549406" 00:14:39.554 ], 00:14:39.554 "product_name": "Malloc disk", 00:14:39.554 "block_size": 512, 00:14:39.554 "num_blocks": 65536, 00:14:39.554 "uuid": "240af21c-8669-42df-ba69-b77ac9549406", 00:14:39.554 "assigned_rate_limits": { 00:14:39.554 "rw_ios_per_sec": 0, 00:14:39.554 "rw_mbytes_per_sec": 0, 00:14:39.554 "r_mbytes_per_sec": 0, 00:14:39.554 "w_mbytes_per_sec": 0 00:14:39.554 }, 00:14:39.554 "claimed": true, 00:14:39.554 "claim_type": "exclusive_write", 00:14:39.554 "zoned": false, 00:14:39.554 "supported_io_types": { 00:14:39.554 "read": true, 00:14:39.554 "write": true, 00:14:39.554 "unmap": true, 00:14:39.554 "flush": true, 00:14:39.554 "reset": true, 00:14:39.554 "nvme_admin": false, 00:14:39.554 "nvme_io": false, 00:14:39.554 "nvme_io_md": false, 00:14:39.554 "write_zeroes": true, 00:14:39.554 "zcopy": true, 00:14:39.554 "get_zone_info": false, 00:14:39.554 "zone_management": false, 00:14:39.554 "zone_append": false, 
00:14:39.555 "compare": false, 00:14:39.555 "compare_and_write": false, 00:14:39.555 "abort": true, 00:14:39.555 "seek_hole": false, 00:14:39.555 "seek_data": false, 00:14:39.555 "copy": true, 00:14:39.555 "nvme_iov_md": false 00:14:39.555 }, 00:14:39.555 "memory_domains": [ 00:14:39.555 { 00:14:39.555 "dma_device_id": "system", 00:14:39.555 "dma_device_type": 1 00:14:39.555 }, 00:14:39.555 { 00:14:39.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.555 "dma_device_type": 2 00:14:39.555 } 00:14:39.555 ], 00:14:39.555 "driver_specific": {} 00:14:39.555 } 00:14:39.555 ] 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.555 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:39.813 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.813 "name": "Existed_Raid", 00:14:39.813 "uuid": "87cfe4d8-3448-423a-9d18-cdaa095436e0", 00:14:39.813 "strip_size_kb": 64, 00:14:39.813 "state": "online", 00:14:39.813 "raid_level": "raid0", 00:14:39.813 "superblock": true, 00:14:39.813 "num_base_bdevs": 3, 00:14:39.813 "num_base_bdevs_discovered": 3, 00:14:39.813 "num_base_bdevs_operational": 3, 00:14:39.813 "base_bdevs_list": [ 00:14:39.813 { 00:14:39.813 "name": "BaseBdev1", 00:14:39.813 "uuid": "22bcef33-eff3-4a52-9d13-eaf66f629331", 00:14:39.813 "is_configured": true, 00:14:39.813 "data_offset": 2048, 00:14:39.813 "data_size": 63488 00:14:39.813 }, 00:14:39.813 { 00:14:39.813 "name": "BaseBdev2", 00:14:39.813 "uuid": "2e7f9e44-a0df-46e3-afee-4e7ecea0b292", 00:14:39.813 "is_configured": true, 00:14:39.813 "data_offset": 2048, 00:14:39.813 "data_size": 63488 00:14:39.813 }, 00:14:39.813 { 00:14:39.813 "name": "BaseBdev3", 00:14:39.813 "uuid": "240af21c-8669-42df-ba69-b77ac9549406", 00:14:39.813 "is_configured": true, 00:14:39.813 "data_offset": 
2048, 00:14:39.813 "data_size": 63488 00:14:39.813 } 00:14:39.813 ] 00:14:39.813 }' 00:14:39.813 22:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.813 22:21:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:40.381 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:40.381 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:40.381 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:40.381 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:40.381 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:40.381 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:40.381 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:40.381 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:40.641 [2024-07-12 22:21:50.758336] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:40.641 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:40.641 "name": "Existed_Raid", 00:14:40.641 "aliases": [ 00:14:40.641 "87cfe4d8-3448-423a-9d18-cdaa095436e0" 00:14:40.641 ], 00:14:40.641 "product_name": "Raid Volume", 00:14:40.641 "block_size": 512, 00:14:40.641 "num_blocks": 190464, 00:14:40.641 "uuid": "87cfe4d8-3448-423a-9d18-cdaa095436e0", 00:14:40.641 "assigned_rate_limits": { 00:14:40.641 "rw_ios_per_sec": 0, 00:14:40.641 "rw_mbytes_per_sec": 0, 00:14:40.641 "r_mbytes_per_sec": 0, 00:14:40.641 "w_mbytes_per_sec": 0 00:14:40.641 }, 00:14:40.641 "claimed": false, 00:14:40.641 "zoned": false, 00:14:40.641 "supported_io_types": { 00:14:40.641 "read": true, 00:14:40.641 "write": true, 00:14:40.641 "unmap": true, 00:14:40.641 "flush": true, 00:14:40.641 "reset": true, 00:14:40.641 "nvme_admin": false, 00:14:40.641 "nvme_io": false, 00:14:40.641 "nvme_io_md": false, 00:14:40.641 "write_zeroes": true, 00:14:40.641 "zcopy": false, 00:14:40.641 "get_zone_info": false, 00:14:40.641 "zone_management": false, 00:14:40.641 "zone_append": false, 00:14:40.641 "compare": false, 00:14:40.641 "compare_and_write": false, 00:14:40.641 "abort": false, 00:14:40.641 "seek_hole": false, 00:14:40.641 "seek_data": false, 00:14:40.641 "copy": false, 00:14:40.641 "nvme_iov_md": false 00:14:40.641 }, 00:14:40.641 "memory_domains": [ 00:14:40.641 { 00:14:40.641 "dma_device_id": "system", 00:14:40.641 "dma_device_type": 1 00:14:40.641 }, 00:14:40.641 { 00:14:40.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.641 "dma_device_type": 2 00:14:40.641 }, 00:14:40.641 { 00:14:40.641 "dma_device_id": "system", 00:14:40.641 "dma_device_type": 1 00:14:40.641 }, 00:14:40.641 { 00:14:40.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.641 "dma_device_type": 2 00:14:40.641 }, 00:14:40.641 { 00:14:40.641 "dma_device_id": "system", 00:14:40.641 "dma_device_type": 1 00:14:40.641 }, 00:14:40.641 { 00:14:40.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.641 "dma_device_type": 2 00:14:40.641 } 00:14:40.641 ], 00:14:40.641 "driver_specific": { 00:14:40.641 "raid": 
{ 00:14:40.641 "uuid": "87cfe4d8-3448-423a-9d18-cdaa095436e0", 00:14:40.641 "strip_size_kb": 64, 00:14:40.641 "state": "online", 00:14:40.641 "raid_level": "raid0", 00:14:40.641 "superblock": true, 00:14:40.641 "num_base_bdevs": 3, 00:14:40.641 "num_base_bdevs_discovered": 3, 00:14:40.641 "num_base_bdevs_operational": 3, 00:14:40.641 "base_bdevs_list": [ 00:14:40.641 { 00:14:40.641 "name": "BaseBdev1", 00:14:40.641 "uuid": "22bcef33-eff3-4a52-9d13-eaf66f629331", 00:14:40.641 "is_configured": true, 00:14:40.641 "data_offset": 2048, 00:14:40.641 "data_size": 63488 00:14:40.641 }, 00:14:40.641 { 00:14:40.641 "name": "BaseBdev2", 00:14:40.641 "uuid": "2e7f9e44-a0df-46e3-afee-4e7ecea0b292", 00:14:40.641 "is_configured": true, 00:14:40.641 "data_offset": 2048, 00:14:40.641 "data_size": 63488 00:14:40.641 }, 00:14:40.641 { 00:14:40.641 "name": "BaseBdev3", 00:14:40.641 "uuid": "240af21c-8669-42df-ba69-b77ac9549406", 00:14:40.641 "is_configured": true, 00:14:40.641 "data_offset": 2048, 00:14:40.641 "data_size": 63488 00:14:40.641 } 00:14:40.641 ] 00:14:40.641 } 00:14:40.641 } 00:14:40.641 }' 00:14:40.641 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:40.641 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:40.641 BaseBdev2 00:14:40.641 BaseBdev3' 00:14:40.641 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:40.641 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:40.641 22:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:40.901 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:40.901 "name": "BaseBdev1", 00:14:40.901 "aliases": [ 00:14:40.901 "22bcef33-eff3-4a52-9d13-eaf66f629331" 00:14:40.901 ], 00:14:40.901 "product_name": "Malloc disk", 00:14:40.901 "block_size": 512, 00:14:40.901 "num_blocks": 65536, 00:14:40.901 "uuid": "22bcef33-eff3-4a52-9d13-eaf66f629331", 00:14:40.901 "assigned_rate_limits": { 00:14:40.901 "rw_ios_per_sec": 0, 00:14:40.901 "rw_mbytes_per_sec": 0, 00:14:40.901 "r_mbytes_per_sec": 0, 00:14:40.901 "w_mbytes_per_sec": 0 00:14:40.901 }, 00:14:40.901 "claimed": true, 00:14:40.901 "claim_type": "exclusive_write", 00:14:40.901 "zoned": false, 00:14:40.901 "supported_io_types": { 00:14:40.901 "read": true, 00:14:40.901 "write": true, 00:14:40.901 "unmap": true, 00:14:40.901 "flush": true, 00:14:40.901 "reset": true, 00:14:40.901 "nvme_admin": false, 00:14:40.901 "nvme_io": false, 00:14:40.901 "nvme_io_md": false, 00:14:40.901 "write_zeroes": true, 00:14:40.901 "zcopy": true, 00:14:40.901 "get_zone_info": false, 00:14:40.901 "zone_management": false, 00:14:40.901 "zone_append": false, 00:14:40.901 "compare": false, 00:14:40.901 "compare_and_write": false, 00:14:40.901 "abort": true, 00:14:40.901 "seek_hole": false, 00:14:40.901 "seek_data": false, 00:14:40.901 "copy": true, 00:14:40.901 "nvme_iov_md": false 00:14:40.901 }, 00:14:40.901 "memory_domains": [ 00:14:40.901 { 00:14:40.901 "dma_device_id": "system", 00:14:40.901 "dma_device_type": 1 00:14:40.901 }, 00:14:40.901 { 00:14:40.901 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.901 "dma_device_type": 2 00:14:40.901 } 00:14:40.901 ], 00:14:40.901 
"driver_specific": {} 00:14:40.901 }' 00:14:40.901 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.901 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.901 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:40.901 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.901 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.160 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.160 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.160 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.160 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.160 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.160 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.160 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.160 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.160 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:41.160 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.419 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.419 "name": "BaseBdev2", 00:14:41.419 "aliases": [ 00:14:41.419 "2e7f9e44-a0df-46e3-afee-4e7ecea0b292" 00:14:41.419 ], 00:14:41.419 "product_name": "Malloc disk", 00:14:41.419 "block_size": 512, 00:14:41.419 "num_blocks": 65536, 00:14:41.419 "uuid": "2e7f9e44-a0df-46e3-afee-4e7ecea0b292", 00:14:41.419 "assigned_rate_limits": { 00:14:41.419 "rw_ios_per_sec": 0, 00:14:41.419 "rw_mbytes_per_sec": 0, 00:14:41.419 "r_mbytes_per_sec": 0, 00:14:41.419 "w_mbytes_per_sec": 0 00:14:41.419 }, 00:14:41.419 "claimed": true, 00:14:41.419 "claim_type": "exclusive_write", 00:14:41.419 "zoned": false, 00:14:41.419 "supported_io_types": { 00:14:41.419 "read": true, 00:14:41.419 "write": true, 00:14:41.419 "unmap": true, 00:14:41.419 "flush": true, 00:14:41.419 "reset": true, 00:14:41.419 "nvme_admin": false, 00:14:41.419 "nvme_io": false, 00:14:41.419 "nvme_io_md": false, 00:14:41.419 "write_zeroes": true, 00:14:41.419 "zcopy": true, 00:14:41.419 "get_zone_info": false, 00:14:41.419 "zone_management": false, 00:14:41.419 "zone_append": false, 00:14:41.419 "compare": false, 00:14:41.419 "compare_and_write": false, 00:14:41.419 "abort": true, 00:14:41.419 "seek_hole": false, 00:14:41.419 "seek_data": false, 00:14:41.419 "copy": true, 00:14:41.419 "nvme_iov_md": false 00:14:41.419 }, 00:14:41.419 "memory_domains": [ 00:14:41.419 { 00:14:41.419 "dma_device_id": "system", 00:14:41.419 "dma_device_type": 1 00:14:41.419 }, 00:14:41.419 { 00:14:41.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.419 "dma_device_type": 2 00:14:41.419 } 00:14:41.419 ], 00:14:41.419 "driver_specific": {} 00:14:41.419 }' 00:14:41.419 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:14:41.419 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.419 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:41.419 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.419 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.678 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.678 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.678 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.678 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.678 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.678 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.678 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.679 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.679 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:41.679 22:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.938 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.938 "name": "BaseBdev3", 00:14:41.938 "aliases": [ 00:14:41.938 "240af21c-8669-42df-ba69-b77ac9549406" 00:14:41.938 ], 00:14:41.938 "product_name": "Malloc disk", 00:14:41.938 "block_size": 512, 00:14:41.938 "num_blocks": 65536, 00:14:41.938 "uuid": "240af21c-8669-42df-ba69-b77ac9549406", 00:14:41.938 "assigned_rate_limits": { 00:14:41.938 "rw_ios_per_sec": 0, 00:14:41.938 "rw_mbytes_per_sec": 0, 00:14:41.938 "r_mbytes_per_sec": 0, 00:14:41.938 "w_mbytes_per_sec": 0 00:14:41.938 }, 00:14:41.938 "claimed": true, 00:14:41.938 "claim_type": "exclusive_write", 00:14:41.938 "zoned": false, 00:14:41.938 "supported_io_types": { 00:14:41.938 "read": true, 00:14:41.938 "write": true, 00:14:41.938 "unmap": true, 00:14:41.938 "flush": true, 00:14:41.938 "reset": true, 00:14:41.938 "nvme_admin": false, 00:14:41.938 "nvme_io": false, 00:14:41.938 "nvme_io_md": false, 00:14:41.938 "write_zeroes": true, 00:14:41.938 "zcopy": true, 00:14:41.938 "get_zone_info": false, 00:14:41.938 "zone_management": false, 00:14:41.938 "zone_append": false, 00:14:41.938 "compare": false, 00:14:41.938 "compare_and_write": false, 00:14:41.938 "abort": true, 00:14:41.938 "seek_hole": false, 00:14:41.938 "seek_data": false, 00:14:41.938 "copy": true, 00:14:41.938 "nvme_iov_md": false 00:14:41.938 }, 00:14:41.938 "memory_domains": [ 00:14:41.938 { 00:14:41.938 "dma_device_id": "system", 00:14:41.938 "dma_device_type": 1 00:14:41.938 }, 00:14:41.938 { 00:14:41.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.938 "dma_device_type": 2 00:14:41.938 } 00:14:41.938 ], 00:14:41.938 "driver_specific": {} 00:14:41.938 }' 00:14:41.938 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.938 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.196 
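(The jq .block_size / .md_size / .md_interleave / .dif_type comparisons around this point come from verify_raid_bdev_properties: every base bdev configured into Existed_Raid must report the same block size and metadata layout as the assembled raid volume. A compact sketch of that loop, pieced together from the traced filters; treat it as an illustration of the pattern rather than the helper's exact body, and note the rpc shorthand variable is an assumption for brevity:)

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    raid_bdev_info=$($rpc bdev_get_bdevs -b Existed_Raid | jq '.[]')
    # names of the base bdevs currently configured into the volume
    base_bdev_names=$(echo "$raid_bdev_info" \
        | jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')
    for name in $base_bdev_names; do
        base_bdev_info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')
        # block size and metadata layout must match between the raid volume and each member
        [ "$(echo "$base_bdev_info" | jq .block_size)"    = "$(echo "$raid_bdev_info" | jq .block_size)" ]
        [ "$(echo "$base_bdev_info" | jq .md_size)"       = "$(echo "$raid_bdev_info" | jq .md_size)" ]
        [ "$(echo "$base_bdev_info" | jq .md_interleave)" = "$(echo "$raid_bdev_info" | jq .md_interleave)" ]
        [ "$(echo "$base_bdev_info" | jq .dif_type)"      = "$(echo "$raid_bdev_info" | jq .dif_type)" ]
    done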
22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:42.196 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.196 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.196 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:42.196 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.196 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.196 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:42.196 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.196 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.456 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:42.456 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:42.731 [2024-07-12 22:21:52.787482] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:42.731 [2024-07-12 22:21:52.787512] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:42.731 [2024-07-12 22:21:52.787559] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:42.731 22:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.731 22:21:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:42.731 22:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:42.731 "name": "Existed_Raid", 00:14:42.731 "uuid": "87cfe4d8-3448-423a-9d18-cdaa095436e0", 00:14:42.731 "strip_size_kb": 64, 00:14:42.731 "state": "offline", 00:14:42.731 "raid_level": "raid0", 00:14:42.731 "superblock": true, 00:14:42.731 "num_base_bdevs": 3, 00:14:42.731 "num_base_bdevs_discovered": 2, 00:14:42.731 "num_base_bdevs_operational": 2, 00:14:42.731 "base_bdevs_list": [ 00:14:42.731 { 00:14:42.731 "name": null, 00:14:42.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.731 "is_configured": false, 00:14:42.731 "data_offset": 2048, 00:14:42.731 "data_size": 63488 00:14:42.731 }, 00:14:42.731 { 00:14:42.731 "name": "BaseBdev2", 00:14:42.731 "uuid": "2e7f9e44-a0df-46e3-afee-4e7ecea0b292", 00:14:42.731 "is_configured": true, 00:14:42.731 "data_offset": 2048, 00:14:42.731 "data_size": 63488 00:14:42.731 }, 00:14:42.731 { 00:14:42.731 "name": "BaseBdev3", 00:14:42.731 "uuid": "240af21c-8669-42df-ba69-b77ac9549406", 00:14:42.731 "is_configured": true, 00:14:42.731 "data_offset": 2048, 00:14:42.731 "data_size": 63488 00:14:42.731 } 00:14:42.731 ] 00:14:42.731 }' 00:14:42.731 22:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:42.731 22:21:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:43.668 22:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:43.668 22:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:43.668 22:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:43.668 22:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.668 22:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:43.668 22:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:43.668 22:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:43.927 [2024-07-12 22:21:54.128973] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:43.927 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:43.927 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:43.927 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.927 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:44.187 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:44.187 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:44.187 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:44.446 [2024-07-12 22:21:54.541361] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:44.446 [2024-07-12 22:21:54.541413] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dff400 name Existed_Raid, state offline 00:14:44.446 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:44.446 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:44.446 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.446 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:44.446 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:44.446 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:44.446 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:44.446 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:44.446 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:44.446 22:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:45.014 BaseBdev2 00:14:45.014 22:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:45.014 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:45.014 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:45.014 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:45.015 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:45.015 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:45.015 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:45.274 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:45.534 [ 00:14:45.534 { 00:14:45.534 "name": "BaseBdev2", 00:14:45.534 "aliases": [ 00:14:45.534 "6732e6e4-5d07-4aa7-bf97-586c47c64fa3" 00:14:45.534 ], 00:14:45.534 "product_name": "Malloc disk", 00:14:45.534 "block_size": 512, 00:14:45.534 "num_blocks": 65536, 00:14:45.534 "uuid": "6732e6e4-5d07-4aa7-bf97-586c47c64fa3", 00:14:45.534 "assigned_rate_limits": { 00:14:45.534 "rw_ios_per_sec": 0, 00:14:45.534 "rw_mbytes_per_sec": 0, 00:14:45.534 "r_mbytes_per_sec": 0, 00:14:45.534 "w_mbytes_per_sec": 0 00:14:45.534 }, 00:14:45.534 "claimed": false, 00:14:45.534 "zoned": false, 00:14:45.534 "supported_io_types": { 00:14:45.534 "read": true, 00:14:45.534 "write": true, 00:14:45.534 "unmap": true, 00:14:45.534 "flush": true, 00:14:45.534 "reset": true, 00:14:45.534 "nvme_admin": false, 00:14:45.534 
"nvme_io": false, 00:14:45.534 "nvme_io_md": false, 00:14:45.534 "write_zeroes": true, 00:14:45.534 "zcopy": true, 00:14:45.534 "get_zone_info": false, 00:14:45.534 "zone_management": false, 00:14:45.534 "zone_append": false, 00:14:45.534 "compare": false, 00:14:45.534 "compare_and_write": false, 00:14:45.534 "abort": true, 00:14:45.534 "seek_hole": false, 00:14:45.534 "seek_data": false, 00:14:45.534 "copy": true, 00:14:45.534 "nvme_iov_md": false 00:14:45.534 }, 00:14:45.534 "memory_domains": [ 00:14:45.534 { 00:14:45.534 "dma_device_id": "system", 00:14:45.534 "dma_device_type": 1 00:14:45.534 }, 00:14:45.534 { 00:14:45.534 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.534 "dma_device_type": 2 00:14:45.534 } 00:14:45.534 ], 00:14:45.534 "driver_specific": {} 00:14:45.534 } 00:14:45.534 ] 00:14:45.534 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:45.534 22:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:45.534 22:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:45.534 22:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:45.793 BaseBdev3 00:14:45.793 22:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:45.793 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:45.793 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:45.793 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:45.793 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:45.793 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:45.793 22:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:46.053 22:21:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:46.313 [ 00:14:46.313 { 00:14:46.313 "name": "BaseBdev3", 00:14:46.313 "aliases": [ 00:14:46.313 "4d4e0396-e8ed-4a39-9df3-fa5a6920984d" 00:14:46.313 ], 00:14:46.313 "product_name": "Malloc disk", 00:14:46.313 "block_size": 512, 00:14:46.313 "num_blocks": 65536, 00:14:46.313 "uuid": "4d4e0396-e8ed-4a39-9df3-fa5a6920984d", 00:14:46.313 "assigned_rate_limits": { 00:14:46.313 "rw_ios_per_sec": 0, 00:14:46.313 "rw_mbytes_per_sec": 0, 00:14:46.313 "r_mbytes_per_sec": 0, 00:14:46.313 "w_mbytes_per_sec": 0 00:14:46.313 }, 00:14:46.313 "claimed": false, 00:14:46.313 "zoned": false, 00:14:46.313 "supported_io_types": { 00:14:46.313 "read": true, 00:14:46.313 "write": true, 00:14:46.313 "unmap": true, 00:14:46.313 "flush": true, 00:14:46.313 "reset": true, 00:14:46.313 "nvme_admin": false, 00:14:46.313 "nvme_io": false, 00:14:46.313 "nvme_io_md": false, 00:14:46.313 "write_zeroes": true, 00:14:46.313 "zcopy": true, 00:14:46.313 "get_zone_info": false, 00:14:46.313 "zone_management": false, 00:14:46.313 "zone_append": false, 00:14:46.313 "compare": 
false, 00:14:46.313 "compare_and_write": false, 00:14:46.313 "abort": true, 00:14:46.313 "seek_hole": false, 00:14:46.313 "seek_data": false, 00:14:46.313 "copy": true, 00:14:46.313 "nvme_iov_md": false 00:14:46.313 }, 00:14:46.313 "memory_domains": [ 00:14:46.313 { 00:14:46.313 "dma_device_id": "system", 00:14:46.313 "dma_device_type": 1 00:14:46.313 }, 00:14:46.313 { 00:14:46.313 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.313 "dma_device_type": 2 00:14:46.313 } 00:14:46.313 ], 00:14:46.313 "driver_specific": {} 00:14:46.313 } 00:14:46.313 ] 00:14:46.313 22:21:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:46.313 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:46.313 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:46.313 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:46.572 [2024-07-12 22:21:56.669604] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:46.572 [2024-07-12 22:21:56.669650] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:46.572 [2024-07-12 22:21:56.669672] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:46.573 [2024-07-12 22:21:56.671069] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:46.573 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:46.573 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:46.573 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:46.573 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:46.573 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.573 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:46.573 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.573 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.573 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.573 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.573 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.573 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.832 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.832 "name": "Existed_Raid", 00:14:46.832 "uuid": "e0050d2d-2668-4098-a63c-3d25aefdc597", 00:14:46.832 "strip_size_kb": 64, 00:14:46.832 "state": "configuring", 00:14:46.832 "raid_level": "raid0", 00:14:46.832 "superblock": true, 00:14:46.832 
"num_base_bdevs": 3, 00:14:46.832 "num_base_bdevs_discovered": 2, 00:14:46.832 "num_base_bdevs_operational": 3, 00:14:46.832 "base_bdevs_list": [ 00:14:46.832 { 00:14:46.832 "name": "BaseBdev1", 00:14:46.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.832 "is_configured": false, 00:14:46.832 "data_offset": 0, 00:14:46.832 "data_size": 0 00:14:46.832 }, 00:14:46.832 { 00:14:46.832 "name": "BaseBdev2", 00:14:46.832 "uuid": "6732e6e4-5d07-4aa7-bf97-586c47c64fa3", 00:14:46.832 "is_configured": true, 00:14:46.832 "data_offset": 2048, 00:14:46.832 "data_size": 63488 00:14:46.832 }, 00:14:46.832 { 00:14:46.832 "name": "BaseBdev3", 00:14:46.832 "uuid": "4d4e0396-e8ed-4a39-9df3-fa5a6920984d", 00:14:46.832 "is_configured": true, 00:14:46.832 "data_offset": 2048, 00:14:46.832 "data_size": 63488 00:14:46.832 } 00:14:46.832 ] 00:14:46.832 }' 00:14:46.832 22:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.832 22:21:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:47.400 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:47.400 [2024-07-12 22:21:57.712341] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:47.659 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.659 "name": "Existed_Raid", 00:14:47.659 "uuid": "e0050d2d-2668-4098-a63c-3d25aefdc597", 00:14:47.659 "strip_size_kb": 64, 00:14:47.659 "state": "configuring", 00:14:47.659 "raid_level": "raid0", 00:14:47.659 "superblock": true, 00:14:47.659 "num_base_bdevs": 3, 00:14:47.659 "num_base_bdevs_discovered": 1, 00:14:47.659 "num_base_bdevs_operational": 3, 00:14:47.659 "base_bdevs_list": [ 00:14:47.659 { 00:14:47.659 "name": "BaseBdev1", 00:14:47.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.659 
"is_configured": false, 00:14:47.659 "data_offset": 0, 00:14:47.660 "data_size": 0 00:14:47.660 }, 00:14:47.660 { 00:14:47.660 "name": null, 00:14:47.660 "uuid": "6732e6e4-5d07-4aa7-bf97-586c47c64fa3", 00:14:47.660 "is_configured": false, 00:14:47.660 "data_offset": 2048, 00:14:47.660 "data_size": 63488 00:14:47.660 }, 00:14:47.660 { 00:14:47.660 "name": "BaseBdev3", 00:14:47.660 "uuid": "4d4e0396-e8ed-4a39-9df3-fa5a6920984d", 00:14:47.660 "is_configured": true, 00:14:47.660 "data_offset": 2048, 00:14:47.660 "data_size": 63488 00:14:47.660 } 00:14:47.660 ] 00:14:47.660 }' 00:14:47.660 22:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.660 22:21:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:48.226 22:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.226 22:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:48.486 22:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:48.486 22:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:48.745 [2024-07-12 22:21:58.980337] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:48.745 BaseBdev1 00:14:48.745 22:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:48.745 22:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:48.745 22:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:48.745 22:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:48.745 22:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:48.745 22:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:48.745 22:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:49.005 22:21:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:49.264 [ 00:14:49.264 { 00:14:49.264 "name": "BaseBdev1", 00:14:49.264 "aliases": [ 00:14:49.264 "512abbea-ac18-448f-8077-72791f59389f" 00:14:49.264 ], 00:14:49.264 "product_name": "Malloc disk", 00:14:49.264 "block_size": 512, 00:14:49.264 "num_blocks": 65536, 00:14:49.264 "uuid": "512abbea-ac18-448f-8077-72791f59389f", 00:14:49.264 "assigned_rate_limits": { 00:14:49.264 "rw_ios_per_sec": 0, 00:14:49.264 "rw_mbytes_per_sec": 0, 00:14:49.264 "r_mbytes_per_sec": 0, 00:14:49.264 "w_mbytes_per_sec": 0 00:14:49.264 }, 00:14:49.264 "claimed": true, 00:14:49.264 "claim_type": "exclusive_write", 00:14:49.264 "zoned": false, 00:14:49.264 "supported_io_types": { 00:14:49.264 "read": true, 00:14:49.264 "write": true, 00:14:49.264 "unmap": true, 00:14:49.264 "flush": true, 00:14:49.264 "reset": true, 00:14:49.264 
"nvme_admin": false, 00:14:49.264 "nvme_io": false, 00:14:49.264 "nvme_io_md": false, 00:14:49.264 "write_zeroes": true, 00:14:49.264 "zcopy": true, 00:14:49.264 "get_zone_info": false, 00:14:49.264 "zone_management": false, 00:14:49.264 "zone_append": false, 00:14:49.264 "compare": false, 00:14:49.264 "compare_and_write": false, 00:14:49.264 "abort": true, 00:14:49.264 "seek_hole": false, 00:14:49.264 "seek_data": false, 00:14:49.264 "copy": true, 00:14:49.264 "nvme_iov_md": false 00:14:49.264 }, 00:14:49.264 "memory_domains": [ 00:14:49.264 { 00:14:49.264 "dma_device_id": "system", 00:14:49.264 "dma_device_type": 1 00:14:49.264 }, 00:14:49.264 { 00:14:49.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:49.264 "dma_device_type": 2 00:14:49.264 } 00:14:49.264 ], 00:14:49.264 "driver_specific": {} 00:14:49.264 } 00:14:49.264 ] 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.264 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.832 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.832 "name": "Existed_Raid", 00:14:49.832 "uuid": "e0050d2d-2668-4098-a63c-3d25aefdc597", 00:14:49.832 "strip_size_kb": 64, 00:14:49.832 "state": "configuring", 00:14:49.832 "raid_level": "raid0", 00:14:49.832 "superblock": true, 00:14:49.832 "num_base_bdevs": 3, 00:14:49.832 "num_base_bdevs_discovered": 2, 00:14:49.832 "num_base_bdevs_operational": 3, 00:14:49.832 "base_bdevs_list": [ 00:14:49.832 { 00:14:49.832 "name": "BaseBdev1", 00:14:49.832 "uuid": "512abbea-ac18-448f-8077-72791f59389f", 00:14:49.832 "is_configured": true, 00:14:49.832 "data_offset": 2048, 00:14:49.832 "data_size": 63488 00:14:49.832 }, 00:14:49.832 { 00:14:49.832 "name": null, 00:14:49.832 "uuid": "6732e6e4-5d07-4aa7-bf97-586c47c64fa3", 00:14:49.832 "is_configured": false, 00:14:49.832 "data_offset": 2048, 00:14:49.832 "data_size": 63488 00:14:49.832 }, 00:14:49.832 { 00:14:49.832 "name": "BaseBdev3", 00:14:49.832 "uuid": "4d4e0396-e8ed-4a39-9df3-fa5a6920984d", 00:14:49.832 
"is_configured": true, 00:14:49.832 "data_offset": 2048, 00:14:49.832 "data_size": 63488 00:14:49.832 } 00:14:49.832 ] 00:14:49.832 }' 00:14:49.832 22:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.832 22:21:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:50.400 22:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.400 22:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:50.968 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:50.968 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:50.968 [2024-07-12 22:22:01.230343] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:50.969 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:50.969 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:50.969 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:50.969 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:50.969 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:50.969 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:50.969 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:50.969 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:50.969 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:50.969 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:50.969 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.969 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:51.227 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.227 "name": "Existed_Raid", 00:14:51.227 "uuid": "e0050d2d-2668-4098-a63c-3d25aefdc597", 00:14:51.227 "strip_size_kb": 64, 00:14:51.227 "state": "configuring", 00:14:51.227 "raid_level": "raid0", 00:14:51.227 "superblock": true, 00:14:51.227 "num_base_bdevs": 3, 00:14:51.227 "num_base_bdevs_discovered": 1, 00:14:51.227 "num_base_bdevs_operational": 3, 00:14:51.227 "base_bdevs_list": [ 00:14:51.227 { 00:14:51.227 "name": "BaseBdev1", 00:14:51.227 "uuid": "512abbea-ac18-448f-8077-72791f59389f", 00:14:51.227 "is_configured": true, 00:14:51.227 "data_offset": 2048, 00:14:51.227 "data_size": 63488 00:14:51.227 }, 00:14:51.227 { 00:14:51.227 "name": null, 00:14:51.227 "uuid": "6732e6e4-5d07-4aa7-bf97-586c47c64fa3", 00:14:51.227 "is_configured": false, 00:14:51.227 "data_offset": 2048, 
00:14:51.227 "data_size": 63488 00:14:51.227 }, 00:14:51.227 { 00:14:51.227 "name": null, 00:14:51.227 "uuid": "4d4e0396-e8ed-4a39-9df3-fa5a6920984d", 00:14:51.227 "is_configured": false, 00:14:51.227 "data_offset": 2048, 00:14:51.227 "data_size": 63488 00:14:51.228 } 00:14:51.228 ] 00:14:51.228 }' 00:14:51.228 22:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.228 22:22:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:51.798 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.798 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:52.057 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:52.057 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:52.315 [2024-07-12 22:22:02.453605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:52.315 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:52.315 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:52.315 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:52.315 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:52.315 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:52.315 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:52.315 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.315 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.315 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.315 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.315 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.315 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:52.574 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:52.574 "name": "Existed_Raid", 00:14:52.574 "uuid": "e0050d2d-2668-4098-a63c-3d25aefdc597", 00:14:52.574 "strip_size_kb": 64, 00:14:52.574 "state": "configuring", 00:14:52.574 "raid_level": "raid0", 00:14:52.574 "superblock": true, 00:14:52.574 "num_base_bdevs": 3, 00:14:52.574 "num_base_bdevs_discovered": 2, 00:14:52.574 "num_base_bdevs_operational": 3, 00:14:52.574 "base_bdevs_list": [ 00:14:52.574 { 00:14:52.574 "name": "BaseBdev1", 00:14:52.574 "uuid": "512abbea-ac18-448f-8077-72791f59389f", 00:14:52.574 "is_configured": true, 00:14:52.574 "data_offset": 2048, 00:14:52.574 "data_size": 
63488 00:14:52.574 }, 00:14:52.574 { 00:14:52.574 "name": null, 00:14:52.574 "uuid": "6732e6e4-5d07-4aa7-bf97-586c47c64fa3", 00:14:52.574 "is_configured": false, 00:14:52.574 "data_offset": 2048, 00:14:52.574 "data_size": 63488 00:14:52.574 }, 00:14:52.574 { 00:14:52.574 "name": "BaseBdev3", 00:14:52.574 "uuid": "4d4e0396-e8ed-4a39-9df3-fa5a6920984d", 00:14:52.574 "is_configured": true, 00:14:52.574 "data_offset": 2048, 00:14:52.574 "data_size": 63488 00:14:52.574 } 00:14:52.574 ] 00:14:52.574 }' 00:14:52.574 22:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:52.574 22:22:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:53.141 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.141 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:53.399 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:53.399 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:53.658 [2024-07-12 22:22:03.769102] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:53.658 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:53.658 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.658 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.658 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:53.658 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:53.658 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.658 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.658 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.658 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.658 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.658 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.658 22:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:54.248 22:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.248 "name": "Existed_Raid", 00:14:54.248 "uuid": "e0050d2d-2668-4098-a63c-3d25aefdc597", 00:14:54.248 "strip_size_kb": 64, 00:14:54.248 "state": "configuring", 00:14:54.248 "raid_level": "raid0", 00:14:54.248 "superblock": true, 00:14:54.248 "num_base_bdevs": 3, 00:14:54.248 "num_base_bdevs_discovered": 1, 00:14:54.248 "num_base_bdevs_operational": 3, 00:14:54.248 "base_bdevs_list": [ 00:14:54.248 { 00:14:54.248 "name": null, 
00:14:54.248 "uuid": "512abbea-ac18-448f-8077-72791f59389f", 00:14:54.248 "is_configured": false, 00:14:54.248 "data_offset": 2048, 00:14:54.248 "data_size": 63488 00:14:54.248 }, 00:14:54.248 { 00:14:54.248 "name": null, 00:14:54.248 "uuid": "6732e6e4-5d07-4aa7-bf97-586c47c64fa3", 00:14:54.248 "is_configured": false, 00:14:54.248 "data_offset": 2048, 00:14:54.248 "data_size": 63488 00:14:54.248 }, 00:14:54.248 { 00:14:54.248 "name": "BaseBdev3", 00:14:54.248 "uuid": "4d4e0396-e8ed-4a39-9df3-fa5a6920984d", 00:14:54.248 "is_configured": true, 00:14:54.248 "data_offset": 2048, 00:14:54.248 "data_size": 63488 00:14:54.248 } 00:14:54.248 ] 00:14:54.248 }' 00:14:54.248 22:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.248 22:22:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:54.816 22:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.816 22:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:54.816 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:54.816 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:55.075 [2024-07-12 22:22:05.305647] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:55.075 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:55.075 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:55.075 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:55.075 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:55.075 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:55.075 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:55.075 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:55.075 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:55.075 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:55.075 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:55.075 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.075 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:55.334 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:55.334 "name": "Existed_Raid", 00:14:55.334 "uuid": "e0050d2d-2668-4098-a63c-3d25aefdc597", 00:14:55.334 "strip_size_kb": 64, 00:14:55.334 "state": "configuring", 00:14:55.334 "raid_level": "raid0", 00:14:55.334 "superblock": true, 
00:14:55.334 "num_base_bdevs": 3, 00:14:55.334 "num_base_bdevs_discovered": 2, 00:14:55.334 "num_base_bdevs_operational": 3, 00:14:55.334 "base_bdevs_list": [ 00:14:55.334 { 00:14:55.334 "name": null, 00:14:55.334 "uuid": "512abbea-ac18-448f-8077-72791f59389f", 00:14:55.334 "is_configured": false, 00:14:55.334 "data_offset": 2048, 00:14:55.334 "data_size": 63488 00:14:55.334 }, 00:14:55.334 { 00:14:55.334 "name": "BaseBdev2", 00:14:55.334 "uuid": "6732e6e4-5d07-4aa7-bf97-586c47c64fa3", 00:14:55.334 "is_configured": true, 00:14:55.334 "data_offset": 2048, 00:14:55.334 "data_size": 63488 00:14:55.334 }, 00:14:55.334 { 00:14:55.334 "name": "BaseBdev3", 00:14:55.334 "uuid": "4d4e0396-e8ed-4a39-9df3-fa5a6920984d", 00:14:55.334 "is_configured": true, 00:14:55.334 "data_offset": 2048, 00:14:55.334 "data_size": 63488 00:14:55.334 } 00:14:55.334 ] 00:14:55.334 }' 00:14:55.334 22:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:55.334 22:22:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:55.900 22:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.900 22:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:56.160 22:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:56.160 22:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.160 22:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:56.419 22:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 512abbea-ac18-448f-8077-72791f59389f 00:14:56.678 [2024-07-12 22:22:06.874440] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:56.678 [2024-07-12 22:22:06.874606] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1dfde90 00:14:56.678 [2024-07-12 22:22:06.874619] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:56.678 [2024-07-12 22:22:06.874798] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b04940 00:14:56.678 [2024-07-12 22:22:06.874911] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1dfde90 00:14:56.678 [2024-07-12 22:22:06.874921] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1dfde90 00:14:56.678 [2024-07-12 22:22:06.875027] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:56.678 NewBaseBdev 00:14:56.678 22:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:56.678 22:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:56.678 22:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:56.678 22:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:56.679 22:22:06 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:56.679 22:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:56.679 22:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:56.937 22:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:57.195 [ 00:14:57.195 { 00:14:57.196 "name": "NewBaseBdev", 00:14:57.196 "aliases": [ 00:14:57.196 "512abbea-ac18-448f-8077-72791f59389f" 00:14:57.196 ], 00:14:57.196 "product_name": "Malloc disk", 00:14:57.196 "block_size": 512, 00:14:57.196 "num_blocks": 65536, 00:14:57.196 "uuid": "512abbea-ac18-448f-8077-72791f59389f", 00:14:57.196 "assigned_rate_limits": { 00:14:57.196 "rw_ios_per_sec": 0, 00:14:57.196 "rw_mbytes_per_sec": 0, 00:14:57.196 "r_mbytes_per_sec": 0, 00:14:57.196 "w_mbytes_per_sec": 0 00:14:57.196 }, 00:14:57.196 "claimed": true, 00:14:57.196 "claim_type": "exclusive_write", 00:14:57.196 "zoned": false, 00:14:57.196 "supported_io_types": { 00:14:57.196 "read": true, 00:14:57.196 "write": true, 00:14:57.196 "unmap": true, 00:14:57.196 "flush": true, 00:14:57.196 "reset": true, 00:14:57.196 "nvme_admin": false, 00:14:57.196 "nvme_io": false, 00:14:57.196 "nvme_io_md": false, 00:14:57.196 "write_zeroes": true, 00:14:57.196 "zcopy": true, 00:14:57.196 "get_zone_info": false, 00:14:57.196 "zone_management": false, 00:14:57.196 "zone_append": false, 00:14:57.196 "compare": false, 00:14:57.196 "compare_and_write": false, 00:14:57.196 "abort": true, 00:14:57.196 "seek_hole": false, 00:14:57.196 "seek_data": false, 00:14:57.196 "copy": true, 00:14:57.196 "nvme_iov_md": false 00:14:57.196 }, 00:14:57.196 "memory_domains": [ 00:14:57.196 { 00:14:57.196 "dma_device_id": "system", 00:14:57.196 "dma_device_type": 1 00:14:57.196 }, 00:14:57.196 { 00:14:57.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.196 "dma_device_type": 2 00:14:57.196 } 00:14:57.196 ], 00:14:57.196 "driver_specific": {} 00:14:57.196 } 00:14:57.196 ] 00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 
00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.196 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:57.481 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:57.481 "name": "Existed_Raid", 00:14:57.481 "uuid": "e0050d2d-2668-4098-a63c-3d25aefdc597", 00:14:57.481 "strip_size_kb": 64, 00:14:57.481 "state": "online", 00:14:57.481 "raid_level": "raid0", 00:14:57.481 "superblock": true, 00:14:57.481 "num_base_bdevs": 3, 00:14:57.482 "num_base_bdevs_discovered": 3, 00:14:57.482 "num_base_bdevs_operational": 3, 00:14:57.482 "base_bdevs_list": [ 00:14:57.482 { 00:14:57.482 "name": "NewBaseBdev", 00:14:57.482 "uuid": "512abbea-ac18-448f-8077-72791f59389f", 00:14:57.482 "is_configured": true, 00:14:57.482 "data_offset": 2048, 00:14:57.482 "data_size": 63488 00:14:57.482 }, 00:14:57.482 { 00:14:57.482 "name": "BaseBdev2", 00:14:57.482 "uuid": "6732e6e4-5d07-4aa7-bf97-586c47c64fa3", 00:14:57.482 "is_configured": true, 00:14:57.482 "data_offset": 2048, 00:14:57.482 "data_size": 63488 00:14:57.482 }, 00:14:57.482 { 00:14:57.482 "name": "BaseBdev3", 00:14:57.482 "uuid": "4d4e0396-e8ed-4a39-9df3-fa5a6920984d", 00:14:57.482 "is_configured": true, 00:14:57.482 "data_offset": 2048, 00:14:57.482 "data_size": 63488 00:14:57.482 } 00:14:57.482 ] 00:14:57.482 }' 00:14:57.482 22:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:57.482 22:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:58.048 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:58.048 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:58.048 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:58.048 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:58.048 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:58.048 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:58.048 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:58.048 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:58.306 [2024-07-12 22:22:08.447190] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:58.306 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:58.306 "name": "Existed_Raid", 00:14:58.306 "aliases": [ 00:14:58.306 "e0050d2d-2668-4098-a63c-3d25aefdc597" 00:14:58.306 ], 00:14:58.306 "product_name": "Raid Volume", 00:14:58.306 "block_size": 512, 00:14:58.306 "num_blocks": 190464, 00:14:58.306 "uuid": "e0050d2d-2668-4098-a63c-3d25aefdc597", 00:14:58.306 "assigned_rate_limits": { 00:14:58.306 "rw_ios_per_sec": 0, 00:14:58.306 "rw_mbytes_per_sec": 0, 00:14:58.306 "r_mbytes_per_sec": 0, 00:14:58.306 "w_mbytes_per_sec": 0 00:14:58.306 }, 00:14:58.306 "claimed": false, 00:14:58.306 "zoned": false, 
00:14:58.306 "supported_io_types": { 00:14:58.306 "read": true, 00:14:58.306 "write": true, 00:14:58.306 "unmap": true, 00:14:58.306 "flush": true, 00:14:58.306 "reset": true, 00:14:58.306 "nvme_admin": false, 00:14:58.306 "nvme_io": false, 00:14:58.306 "nvme_io_md": false, 00:14:58.306 "write_zeroes": true, 00:14:58.306 "zcopy": false, 00:14:58.306 "get_zone_info": false, 00:14:58.306 "zone_management": false, 00:14:58.306 "zone_append": false, 00:14:58.306 "compare": false, 00:14:58.306 "compare_and_write": false, 00:14:58.306 "abort": false, 00:14:58.306 "seek_hole": false, 00:14:58.306 "seek_data": false, 00:14:58.306 "copy": false, 00:14:58.306 "nvme_iov_md": false 00:14:58.306 }, 00:14:58.306 "memory_domains": [ 00:14:58.306 { 00:14:58.306 "dma_device_id": "system", 00:14:58.306 "dma_device_type": 1 00:14:58.306 }, 00:14:58.306 { 00:14:58.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.306 "dma_device_type": 2 00:14:58.306 }, 00:14:58.306 { 00:14:58.306 "dma_device_id": "system", 00:14:58.306 "dma_device_type": 1 00:14:58.306 }, 00:14:58.306 { 00:14:58.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.306 "dma_device_type": 2 00:14:58.306 }, 00:14:58.306 { 00:14:58.306 "dma_device_id": "system", 00:14:58.306 "dma_device_type": 1 00:14:58.306 }, 00:14:58.306 { 00:14:58.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.306 "dma_device_type": 2 00:14:58.306 } 00:14:58.306 ], 00:14:58.306 "driver_specific": { 00:14:58.306 "raid": { 00:14:58.307 "uuid": "e0050d2d-2668-4098-a63c-3d25aefdc597", 00:14:58.307 "strip_size_kb": 64, 00:14:58.307 "state": "online", 00:14:58.307 "raid_level": "raid0", 00:14:58.307 "superblock": true, 00:14:58.307 "num_base_bdevs": 3, 00:14:58.307 "num_base_bdevs_discovered": 3, 00:14:58.307 "num_base_bdevs_operational": 3, 00:14:58.307 "base_bdevs_list": [ 00:14:58.307 { 00:14:58.307 "name": "NewBaseBdev", 00:14:58.307 "uuid": "512abbea-ac18-448f-8077-72791f59389f", 00:14:58.307 "is_configured": true, 00:14:58.307 "data_offset": 2048, 00:14:58.307 "data_size": 63488 00:14:58.307 }, 00:14:58.307 { 00:14:58.307 "name": "BaseBdev2", 00:14:58.307 "uuid": "6732e6e4-5d07-4aa7-bf97-586c47c64fa3", 00:14:58.307 "is_configured": true, 00:14:58.307 "data_offset": 2048, 00:14:58.307 "data_size": 63488 00:14:58.307 }, 00:14:58.307 { 00:14:58.307 "name": "BaseBdev3", 00:14:58.307 "uuid": "4d4e0396-e8ed-4a39-9df3-fa5a6920984d", 00:14:58.307 "is_configured": true, 00:14:58.307 "data_offset": 2048, 00:14:58.307 "data_size": 63488 00:14:58.307 } 00:14:58.307 ] 00:14:58.307 } 00:14:58.307 } 00:14:58.307 }' 00:14:58.307 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:58.307 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:58.307 BaseBdev2 00:14:58.307 BaseBdev3' 00:14:58.307 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:58.307 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:58.307 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.564 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.564 "name": "NewBaseBdev", 00:14:58.564 "aliases": [ 00:14:58.564 
"512abbea-ac18-448f-8077-72791f59389f" 00:14:58.564 ], 00:14:58.564 "product_name": "Malloc disk", 00:14:58.564 "block_size": 512, 00:14:58.564 "num_blocks": 65536, 00:14:58.564 "uuid": "512abbea-ac18-448f-8077-72791f59389f", 00:14:58.564 "assigned_rate_limits": { 00:14:58.564 "rw_ios_per_sec": 0, 00:14:58.564 "rw_mbytes_per_sec": 0, 00:14:58.564 "r_mbytes_per_sec": 0, 00:14:58.564 "w_mbytes_per_sec": 0 00:14:58.564 }, 00:14:58.564 "claimed": true, 00:14:58.564 "claim_type": "exclusive_write", 00:14:58.564 "zoned": false, 00:14:58.564 "supported_io_types": { 00:14:58.564 "read": true, 00:14:58.564 "write": true, 00:14:58.564 "unmap": true, 00:14:58.564 "flush": true, 00:14:58.564 "reset": true, 00:14:58.564 "nvme_admin": false, 00:14:58.564 "nvme_io": false, 00:14:58.564 "nvme_io_md": false, 00:14:58.564 "write_zeroes": true, 00:14:58.564 "zcopy": true, 00:14:58.564 "get_zone_info": false, 00:14:58.564 "zone_management": false, 00:14:58.564 "zone_append": false, 00:14:58.564 "compare": false, 00:14:58.564 "compare_and_write": false, 00:14:58.564 "abort": true, 00:14:58.564 "seek_hole": false, 00:14:58.564 "seek_data": false, 00:14:58.564 "copy": true, 00:14:58.564 "nvme_iov_md": false 00:14:58.564 }, 00:14:58.564 "memory_domains": [ 00:14:58.564 { 00:14:58.564 "dma_device_id": "system", 00:14:58.564 "dma_device_type": 1 00:14:58.564 }, 00:14:58.564 { 00:14:58.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.564 "dma_device_type": 2 00:14:58.564 } 00:14:58.564 ], 00:14:58.564 "driver_specific": {} 00:14:58.564 }' 00:14:58.564 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.564 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.564 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:58.564 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.564 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.822 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:58.822 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.822 22:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.822 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:58.822 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.822 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.822 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:58.822 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:58.822 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:58.822 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:59.080 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:59.080 "name": "BaseBdev2", 00:14:59.080 "aliases": [ 00:14:59.080 "6732e6e4-5d07-4aa7-bf97-586c47c64fa3" 00:14:59.080 ], 00:14:59.080 "product_name": "Malloc disk", 00:14:59.080 "block_size": 512, 
00:14:59.080 "num_blocks": 65536, 00:14:59.080 "uuid": "6732e6e4-5d07-4aa7-bf97-586c47c64fa3", 00:14:59.080 "assigned_rate_limits": { 00:14:59.080 "rw_ios_per_sec": 0, 00:14:59.080 "rw_mbytes_per_sec": 0, 00:14:59.080 "r_mbytes_per_sec": 0, 00:14:59.080 "w_mbytes_per_sec": 0 00:14:59.080 }, 00:14:59.080 "claimed": true, 00:14:59.080 "claim_type": "exclusive_write", 00:14:59.080 "zoned": false, 00:14:59.080 "supported_io_types": { 00:14:59.080 "read": true, 00:14:59.080 "write": true, 00:14:59.080 "unmap": true, 00:14:59.080 "flush": true, 00:14:59.080 "reset": true, 00:14:59.080 "nvme_admin": false, 00:14:59.080 "nvme_io": false, 00:14:59.080 "nvme_io_md": false, 00:14:59.080 "write_zeroes": true, 00:14:59.080 "zcopy": true, 00:14:59.080 "get_zone_info": false, 00:14:59.080 "zone_management": false, 00:14:59.080 "zone_append": false, 00:14:59.080 "compare": false, 00:14:59.080 "compare_and_write": false, 00:14:59.080 "abort": true, 00:14:59.080 "seek_hole": false, 00:14:59.080 "seek_data": false, 00:14:59.080 "copy": true, 00:14:59.080 "nvme_iov_md": false 00:14:59.080 }, 00:14:59.080 "memory_domains": [ 00:14:59.080 { 00:14:59.080 "dma_device_id": "system", 00:14:59.080 "dma_device_type": 1 00:14:59.080 }, 00:14:59.080 { 00:14:59.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.080 "dma_device_type": 2 00:14:59.080 } 00:14:59.080 ], 00:14:59.080 "driver_specific": {} 00:14:59.080 }' 00:14:59.080 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.080 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.337 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:59.337 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.337 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.337 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:59.337 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.337 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.337 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:59.337 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.337 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.595 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:59.595 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:59.595 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:59.595 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:59.852 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:59.852 "name": "BaseBdev3", 00:14:59.852 "aliases": [ 00:14:59.852 "4d4e0396-e8ed-4a39-9df3-fa5a6920984d" 00:14:59.852 ], 00:14:59.852 "product_name": "Malloc disk", 00:14:59.852 "block_size": 512, 00:14:59.852 "num_blocks": 65536, 00:14:59.852 "uuid": "4d4e0396-e8ed-4a39-9df3-fa5a6920984d", 00:14:59.852 "assigned_rate_limits": { 
00:14:59.852 "rw_ios_per_sec": 0, 00:14:59.852 "rw_mbytes_per_sec": 0, 00:14:59.852 "r_mbytes_per_sec": 0, 00:14:59.852 "w_mbytes_per_sec": 0 00:14:59.852 }, 00:14:59.852 "claimed": true, 00:14:59.852 "claim_type": "exclusive_write", 00:14:59.852 "zoned": false, 00:14:59.852 "supported_io_types": { 00:14:59.852 "read": true, 00:14:59.852 "write": true, 00:14:59.852 "unmap": true, 00:14:59.852 "flush": true, 00:14:59.852 "reset": true, 00:14:59.852 "nvme_admin": false, 00:14:59.852 "nvme_io": false, 00:14:59.852 "nvme_io_md": false, 00:14:59.852 "write_zeroes": true, 00:14:59.852 "zcopy": true, 00:14:59.852 "get_zone_info": false, 00:14:59.852 "zone_management": false, 00:14:59.852 "zone_append": false, 00:14:59.852 "compare": false, 00:14:59.852 "compare_and_write": false, 00:14:59.852 "abort": true, 00:14:59.852 "seek_hole": false, 00:14:59.852 "seek_data": false, 00:14:59.852 "copy": true, 00:14:59.852 "nvme_iov_md": false 00:14:59.852 }, 00:14:59.852 "memory_domains": [ 00:14:59.852 { 00:14:59.852 "dma_device_id": "system", 00:14:59.852 "dma_device_type": 1 00:14:59.852 }, 00:14:59.852 { 00:14:59.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.852 "dma_device_type": 2 00:14:59.852 } 00:14:59.852 ], 00:14:59.852 "driver_specific": {} 00:14:59.852 }' 00:14:59.852 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.852 22:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.852 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:59.852 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.852 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.852 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:59.852 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.852 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:00.109 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:00.109 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:00.109 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:00.109 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:00.109 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:00.368 [2024-07-12 22:22:10.460248] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:00.368 [2024-07-12 22:22:10.460277] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:00.368 [2024-07-12 22:22:10.460341] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:00.368 [2024-07-12 22:22:10.460395] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:00.368 [2024-07-12 22:22:10.460414] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dfde90 name Existed_Raid, state offline 00:15:00.368 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3446248 00:15:00.368 
22:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3446248 ']' 00:15:00.368 22:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3446248 00:15:00.368 22:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:15:00.368 22:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:00.368 22:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3446248 00:15:00.368 22:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:00.368 22:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:00.368 22:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3446248' 00:15:00.368 killing process with pid 3446248 00:15:00.368 22:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3446248 00:15:00.368 [2024-07-12 22:22:10.528831] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:00.368 22:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3446248 00:15:00.368 [2024-07-12 22:22:10.555619] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:00.627 22:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:00.627 00:15:00.627 real 0m28.788s 00:15:00.627 user 0m52.942s 00:15:00.627 sys 0m5.077s 00:15:00.627 22:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:00.627 22:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:00.627 ************************************ 00:15:00.627 END TEST raid_state_function_test_sb 00:15:00.627 ************************************ 00:15:00.627 22:22:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:00.627 22:22:10 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:15:00.627 22:22:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:00.627 22:22:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:00.627 22:22:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:00.627 ************************************ 00:15:00.627 START TEST raid_superblock_test 00:15:00.627 ************************************ 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 
00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3450556 00:15:00.627 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3450556 /var/tmp/spdk-raid.sock 00:15:00.628 22:22:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:00.628 22:22:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3450556 ']' 00:15:00.628 22:22:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:00.628 22:22:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:00.628 22:22:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:00.628 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:00.628 22:22:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:00.628 22:22:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.628 [2024-07-12 22:22:10.921363] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:15:00.628 [2024-07-12 22:22:10.921416] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3450556 ] 00:15:00.887 [2024-07-12 22:22:11.033455] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:00.887 [2024-07-12 22:22:11.134058] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:00.887 [2024-07-12 22:22:11.198163] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:00.887 [2024-07-12 22:22:11.198203] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:01.824 22:22:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:01.824 22:22:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:01.824 22:22:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:01.824 22:22:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:01.824 22:22:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:01.824 22:22:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:01.824 22:22:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:01.824 22:22:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:01.824 22:22:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:01.824 22:22:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:01.824 22:22:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:01.824 malloc1 00:15:01.824 22:22:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:01.824 [2024-07-12 22:22:12.110693] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:01.824 [2024-07-12 22:22:12.110739] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:01.824 [2024-07-12 22:22:12.110758] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed2570 00:15:01.824 [2024-07-12 22:22:12.110770] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:01.824 [2024-07-12 22:22:12.112337] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:01.824 [2024-07-12 22:22:12.112366] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:01.824 pt1 00:15:01.824 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:01.824 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:01.824 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:01.824 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:01.824 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:01.824 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:01.824 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:01.824 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:01.824 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:02.083 malloc2 00:15:02.083 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:02.342 [2024-07-12 22:22:12.464384] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:02.342 [2024-07-12 22:22:12.464431] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:02.342 [2024-07-12 22:22:12.464449] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed3970 00:15:02.342 [2024-07-12 22:22:12.464461] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:02.342 [2024-07-12 22:22:12.465887] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:02.342 [2024-07-12 22:22:12.465914] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:02.342 pt2 00:15:02.342 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:02.342 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:02.343 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:02.343 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:02.343 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:02.343 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:02.343 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:02.343 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:02.343 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:02.343 malloc3 00:15:02.602 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:02.602 [2024-07-12 22:22:12.833952] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:02.602 [2024-07-12 22:22:12.833998] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:02.602 [2024-07-12 22:22:12.834014] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x106a340 00:15:02.602 [2024-07-12 22:22:12.834027] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:02.603 [2024-07-12 22:22:12.835415] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:02.603 [2024-07-12 22:22:12.835441] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:02.603 pt3 00:15:02.603 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:02.603 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:02.603 22:22:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:02.860 [2024-07-12 22:22:13.010434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:02.860 [2024-07-12 22:22:13.011595] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:02.860 [2024-07-12 22:22:13.011648] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:02.860 [2024-07-12 22:22:13.011791] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xecaea0 00:15:02.861 [2024-07-12 22:22:13.011803] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:02.861 [2024-07-12 22:22:13.011994] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xed2240 00:15:02.861 [2024-07-12 22:22:13.012128] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xecaea0 00:15:02.861 [2024-07-12 22:22:13.012139] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xecaea0 00:15:02.861 [2024-07-12 22:22:13.012228] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:02.861 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:02.861 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:02.861 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:02.861 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:02.861 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:02.861 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:02.861 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:02.861 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:02.861 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:02.861 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:02.861 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.861 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:03.119 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.119 "name": "raid_bdev1", 00:15:03.119 "uuid": "af7dbacd-ea36-4388-9b62-b22ee4276783", 00:15:03.119 "strip_size_kb": 64, 00:15:03.119 "state": "online", 00:15:03.119 "raid_level": "raid0", 00:15:03.119 "superblock": true, 00:15:03.119 "num_base_bdevs": 3, 
00:15:03.119 "num_base_bdevs_discovered": 3, 00:15:03.119 "num_base_bdevs_operational": 3, 00:15:03.119 "base_bdevs_list": [ 00:15:03.119 { 00:15:03.119 "name": "pt1", 00:15:03.119 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:03.119 "is_configured": true, 00:15:03.119 "data_offset": 2048, 00:15:03.119 "data_size": 63488 00:15:03.119 }, 00:15:03.119 { 00:15:03.119 "name": "pt2", 00:15:03.119 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:03.119 "is_configured": true, 00:15:03.119 "data_offset": 2048, 00:15:03.119 "data_size": 63488 00:15:03.119 }, 00:15:03.119 { 00:15:03.119 "name": "pt3", 00:15:03.119 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:03.119 "is_configured": true, 00:15:03.119 "data_offset": 2048, 00:15:03.119 "data_size": 63488 00:15:03.119 } 00:15:03.119 ] 00:15:03.119 }' 00:15:03.119 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.119 22:22:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:03.687 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:03.687 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:03.687 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:03.687 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:03.687 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:03.687 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:03.687 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:03.687 22:22:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:03.947 [2024-07-12 22:22:14.029413] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:03.947 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:03.947 "name": "raid_bdev1", 00:15:03.947 "aliases": [ 00:15:03.947 "af7dbacd-ea36-4388-9b62-b22ee4276783" 00:15:03.947 ], 00:15:03.947 "product_name": "Raid Volume", 00:15:03.947 "block_size": 512, 00:15:03.947 "num_blocks": 190464, 00:15:03.947 "uuid": "af7dbacd-ea36-4388-9b62-b22ee4276783", 00:15:03.947 "assigned_rate_limits": { 00:15:03.947 "rw_ios_per_sec": 0, 00:15:03.947 "rw_mbytes_per_sec": 0, 00:15:03.947 "r_mbytes_per_sec": 0, 00:15:03.947 "w_mbytes_per_sec": 0 00:15:03.947 }, 00:15:03.947 "claimed": false, 00:15:03.947 "zoned": false, 00:15:03.947 "supported_io_types": { 00:15:03.947 "read": true, 00:15:03.947 "write": true, 00:15:03.947 "unmap": true, 00:15:03.947 "flush": true, 00:15:03.947 "reset": true, 00:15:03.947 "nvme_admin": false, 00:15:03.947 "nvme_io": false, 00:15:03.947 "nvme_io_md": false, 00:15:03.947 "write_zeroes": true, 00:15:03.947 "zcopy": false, 00:15:03.947 "get_zone_info": false, 00:15:03.947 "zone_management": false, 00:15:03.947 "zone_append": false, 00:15:03.947 "compare": false, 00:15:03.947 "compare_and_write": false, 00:15:03.947 "abort": false, 00:15:03.947 "seek_hole": false, 00:15:03.947 "seek_data": false, 00:15:03.947 "copy": false, 00:15:03.947 "nvme_iov_md": false 00:15:03.947 }, 00:15:03.947 "memory_domains": [ 00:15:03.947 { 00:15:03.947 "dma_device_id": "system", 00:15:03.947 "dma_device_type": 1 
00:15:03.947 }, 00:15:03.947 { 00:15:03.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.947 "dma_device_type": 2 00:15:03.947 }, 00:15:03.947 { 00:15:03.947 "dma_device_id": "system", 00:15:03.947 "dma_device_type": 1 00:15:03.947 }, 00:15:03.947 { 00:15:03.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.947 "dma_device_type": 2 00:15:03.947 }, 00:15:03.947 { 00:15:03.947 "dma_device_id": "system", 00:15:03.947 "dma_device_type": 1 00:15:03.947 }, 00:15:03.947 { 00:15:03.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.947 "dma_device_type": 2 00:15:03.947 } 00:15:03.947 ], 00:15:03.947 "driver_specific": { 00:15:03.947 "raid": { 00:15:03.947 "uuid": "af7dbacd-ea36-4388-9b62-b22ee4276783", 00:15:03.947 "strip_size_kb": 64, 00:15:03.947 "state": "online", 00:15:03.947 "raid_level": "raid0", 00:15:03.947 "superblock": true, 00:15:03.947 "num_base_bdevs": 3, 00:15:03.947 "num_base_bdevs_discovered": 3, 00:15:03.947 "num_base_bdevs_operational": 3, 00:15:03.947 "base_bdevs_list": [ 00:15:03.947 { 00:15:03.947 "name": "pt1", 00:15:03.947 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:03.947 "is_configured": true, 00:15:03.947 "data_offset": 2048, 00:15:03.947 "data_size": 63488 00:15:03.947 }, 00:15:03.947 { 00:15:03.947 "name": "pt2", 00:15:03.947 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:03.947 "is_configured": true, 00:15:03.947 "data_offset": 2048, 00:15:03.947 "data_size": 63488 00:15:03.947 }, 00:15:03.947 { 00:15:03.947 "name": "pt3", 00:15:03.947 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:03.947 "is_configured": true, 00:15:03.947 "data_offset": 2048, 00:15:03.947 "data_size": 63488 00:15:03.947 } 00:15:03.947 ] 00:15:03.947 } 00:15:03.947 } 00:15:03.947 }' 00:15:03.947 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:03.947 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:03.947 pt2 00:15:03.947 pt3' 00:15:03.947 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:03.947 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:03.947 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:04.206 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:04.206 "name": "pt1", 00:15:04.206 "aliases": [ 00:15:04.206 "00000000-0000-0000-0000-000000000001" 00:15:04.206 ], 00:15:04.206 "product_name": "passthru", 00:15:04.206 "block_size": 512, 00:15:04.206 "num_blocks": 65536, 00:15:04.206 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:04.206 "assigned_rate_limits": { 00:15:04.206 "rw_ios_per_sec": 0, 00:15:04.206 "rw_mbytes_per_sec": 0, 00:15:04.206 "r_mbytes_per_sec": 0, 00:15:04.206 "w_mbytes_per_sec": 0 00:15:04.206 }, 00:15:04.206 "claimed": true, 00:15:04.206 "claim_type": "exclusive_write", 00:15:04.206 "zoned": false, 00:15:04.206 "supported_io_types": { 00:15:04.206 "read": true, 00:15:04.206 "write": true, 00:15:04.206 "unmap": true, 00:15:04.206 "flush": true, 00:15:04.206 "reset": true, 00:15:04.206 "nvme_admin": false, 00:15:04.206 "nvme_io": false, 00:15:04.206 "nvme_io_md": false, 00:15:04.206 "write_zeroes": true, 00:15:04.206 "zcopy": true, 00:15:04.206 "get_zone_info": false, 00:15:04.206 "zone_management": false, 
00:15:04.206 "zone_append": false, 00:15:04.206 "compare": false, 00:15:04.206 "compare_and_write": false, 00:15:04.206 "abort": true, 00:15:04.206 "seek_hole": false, 00:15:04.206 "seek_data": false, 00:15:04.206 "copy": true, 00:15:04.206 "nvme_iov_md": false 00:15:04.206 }, 00:15:04.206 "memory_domains": [ 00:15:04.206 { 00:15:04.206 "dma_device_id": "system", 00:15:04.206 "dma_device_type": 1 00:15:04.207 }, 00:15:04.207 { 00:15:04.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.207 "dma_device_type": 2 00:15:04.207 } 00:15:04.207 ], 00:15:04.207 "driver_specific": { 00:15:04.207 "passthru": { 00:15:04.207 "name": "pt1", 00:15:04.207 "base_bdev_name": "malloc1" 00:15:04.207 } 00:15:04.207 } 00:15:04.207 }' 00:15:04.207 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.207 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.207 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:04.207 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.207 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.207 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:04.207 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.467 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.467 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:04.467 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.467 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.467 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:04.467 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:04.467 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:04.467 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:04.726 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:04.726 "name": "pt2", 00:15:04.726 "aliases": [ 00:15:04.726 "00000000-0000-0000-0000-000000000002" 00:15:04.726 ], 00:15:04.726 "product_name": "passthru", 00:15:04.726 "block_size": 512, 00:15:04.726 "num_blocks": 65536, 00:15:04.726 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:04.726 "assigned_rate_limits": { 00:15:04.726 "rw_ios_per_sec": 0, 00:15:04.726 "rw_mbytes_per_sec": 0, 00:15:04.726 "r_mbytes_per_sec": 0, 00:15:04.726 "w_mbytes_per_sec": 0 00:15:04.726 }, 00:15:04.726 "claimed": true, 00:15:04.726 "claim_type": "exclusive_write", 00:15:04.726 "zoned": false, 00:15:04.726 "supported_io_types": { 00:15:04.726 "read": true, 00:15:04.726 "write": true, 00:15:04.726 "unmap": true, 00:15:04.726 "flush": true, 00:15:04.726 "reset": true, 00:15:04.726 "nvme_admin": false, 00:15:04.726 "nvme_io": false, 00:15:04.726 "nvme_io_md": false, 00:15:04.726 "write_zeroes": true, 00:15:04.726 "zcopy": true, 00:15:04.726 "get_zone_info": false, 00:15:04.726 "zone_management": false, 00:15:04.726 "zone_append": false, 00:15:04.726 "compare": false, 00:15:04.726 "compare_and_write": false, 00:15:04.726 "abort": true, 
00:15:04.726 "seek_hole": false, 00:15:04.726 "seek_data": false, 00:15:04.726 "copy": true, 00:15:04.726 "nvme_iov_md": false 00:15:04.726 }, 00:15:04.726 "memory_domains": [ 00:15:04.726 { 00:15:04.726 "dma_device_id": "system", 00:15:04.726 "dma_device_type": 1 00:15:04.726 }, 00:15:04.726 { 00:15:04.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.726 "dma_device_type": 2 00:15:04.726 } 00:15:04.726 ], 00:15:04.726 "driver_specific": { 00:15:04.726 "passthru": { 00:15:04.726 "name": "pt2", 00:15:04.726 "base_bdev_name": "malloc2" 00:15:04.726 } 00:15:04.726 } 00:15:04.726 }' 00:15:04.726 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.726 22:22:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.726 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:04.726 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.985 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.985 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:04.985 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.985 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.985 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:04.985 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.985 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.985 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:04.985 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:04.985 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:04.985 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:05.244 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:05.244 "name": "pt3", 00:15:05.244 "aliases": [ 00:15:05.244 "00000000-0000-0000-0000-000000000003" 00:15:05.244 ], 00:15:05.244 "product_name": "passthru", 00:15:05.244 "block_size": 512, 00:15:05.244 "num_blocks": 65536, 00:15:05.244 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:05.244 "assigned_rate_limits": { 00:15:05.244 "rw_ios_per_sec": 0, 00:15:05.244 "rw_mbytes_per_sec": 0, 00:15:05.244 "r_mbytes_per_sec": 0, 00:15:05.244 "w_mbytes_per_sec": 0 00:15:05.244 }, 00:15:05.244 "claimed": true, 00:15:05.244 "claim_type": "exclusive_write", 00:15:05.244 "zoned": false, 00:15:05.244 "supported_io_types": { 00:15:05.244 "read": true, 00:15:05.244 "write": true, 00:15:05.244 "unmap": true, 00:15:05.244 "flush": true, 00:15:05.244 "reset": true, 00:15:05.244 "nvme_admin": false, 00:15:05.244 "nvme_io": false, 00:15:05.244 "nvme_io_md": false, 00:15:05.244 "write_zeroes": true, 00:15:05.244 "zcopy": true, 00:15:05.244 "get_zone_info": false, 00:15:05.244 "zone_management": false, 00:15:05.244 "zone_append": false, 00:15:05.244 "compare": false, 00:15:05.244 "compare_and_write": false, 00:15:05.244 "abort": true, 00:15:05.244 "seek_hole": false, 00:15:05.244 "seek_data": false, 00:15:05.244 "copy": true, 00:15:05.244 "nvme_iov_md": false 
00:15:05.244 }, 00:15:05.244 "memory_domains": [ 00:15:05.244 { 00:15:05.244 "dma_device_id": "system", 00:15:05.244 "dma_device_type": 1 00:15:05.244 }, 00:15:05.244 { 00:15:05.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.244 "dma_device_type": 2 00:15:05.244 } 00:15:05.244 ], 00:15:05.244 "driver_specific": { 00:15:05.244 "passthru": { 00:15:05.244 "name": "pt3", 00:15:05.244 "base_bdev_name": "malloc3" 00:15:05.244 } 00:15:05.244 } 00:15:05.244 }' 00:15:05.244 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.244 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.504 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:05.504 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.504 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.504 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:05.504 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.504 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.504 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:05.504 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.504 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.504 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:05.504 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:05.504 22:22:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:05.763 [2024-07-12 22:22:16.038738] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:05.763 22:22:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=af7dbacd-ea36-4388-9b62-b22ee4276783 00:15:05.763 22:22:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z af7dbacd-ea36-4388-9b62-b22ee4276783 ']' 00:15:05.763 22:22:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:06.022 [2024-07-12 22:22:16.283095] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:06.022 [2024-07-12 22:22:16.283120] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:06.022 [2024-07-12 22:22:16.283173] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:06.023 [2024-07-12 22:22:16.283228] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:06.023 [2024-07-12 22:22:16.283246] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xecaea0 name raid_bdev1, state offline 00:15:06.023 22:22:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.023 22:22:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:06.281 22:22:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:06.281 22:22:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:06.281 22:22:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:06.281 22:22:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:06.540 22:22:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:06.540 22:22:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:06.799 22:22:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:06.799 22:22:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:07.059 22:22:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:07.059 22:22:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:07.318 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 
malloc2 malloc3' -n raid_bdev1 00:15:07.578 [2024-07-12 22:22:17.754938] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:07.578 [2024-07-12 22:22:17.756346] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:07.578 [2024-07-12 22:22:17.756391] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:07.578 [2024-07-12 22:22:17.756448] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:07.578 [2024-07-12 22:22:17.756489] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:07.578 [2024-07-12 22:22:17.756512] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:07.578 [2024-07-12 22:22:17.756539] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:07.578 [2024-07-12 22:22:17.756549] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1075ff0 name raid_bdev1, state configuring 00:15:07.578 request: 00:15:07.578 { 00:15:07.578 "name": "raid_bdev1", 00:15:07.578 "raid_level": "raid0", 00:15:07.578 "base_bdevs": [ 00:15:07.578 "malloc1", 00:15:07.578 "malloc2", 00:15:07.578 "malloc3" 00:15:07.578 ], 00:15:07.578 "strip_size_kb": 64, 00:15:07.578 "superblock": false, 00:15:07.578 "method": "bdev_raid_create", 00:15:07.578 "req_id": 1 00:15:07.578 } 00:15:07.578 Got JSON-RPC error response 00:15:07.578 response: 00:15:07.578 { 00:15:07.578 "code": -17, 00:15:07.578 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:07.578 } 00:15:07.578 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:07.578 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:15:07.578 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:07.578 22:22:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:07.578 22:22:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.578 22:22:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:07.837 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:07.837 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:07.837 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:08.097 [2024-07-12 22:22:18.248164] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:08.097 [2024-07-12 22:22:18.248216] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:08.097 [2024-07-12 22:22:18.248238] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed27a0 00:15:08.097 [2024-07-12 22:22:18.248250] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:08.097 [2024-07-12 22:22:18.249864] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:08.097 [2024-07-12 22:22:18.249894] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:08.097 [2024-07-12 22:22:18.249977] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:08.097 [2024-07-12 22:22:18.250005] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:08.097 pt1 00:15:08.097 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:15:08.097 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:08.097 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:08.097 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:08.097 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:08.097 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:08.097 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:08.097 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:08.097 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:08.097 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:08.097 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.097 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:08.356 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:08.356 "name": "raid_bdev1", 00:15:08.356 "uuid": "af7dbacd-ea36-4388-9b62-b22ee4276783", 00:15:08.356 "strip_size_kb": 64, 00:15:08.356 "state": "configuring", 00:15:08.356 "raid_level": "raid0", 00:15:08.356 "superblock": true, 00:15:08.356 "num_base_bdevs": 3, 00:15:08.356 "num_base_bdevs_discovered": 1, 00:15:08.356 "num_base_bdevs_operational": 3, 00:15:08.356 "base_bdevs_list": [ 00:15:08.356 { 00:15:08.356 "name": "pt1", 00:15:08.356 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:08.356 "is_configured": true, 00:15:08.356 "data_offset": 2048, 00:15:08.356 "data_size": 63488 00:15:08.356 }, 00:15:08.356 { 00:15:08.356 "name": null, 00:15:08.356 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:08.356 "is_configured": false, 00:15:08.356 "data_offset": 2048, 00:15:08.356 "data_size": 63488 00:15:08.356 }, 00:15:08.356 { 00:15:08.356 "name": null, 00:15:08.356 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:08.356 "is_configured": false, 00:15:08.356 "data_offset": 2048, 00:15:08.356 "data_size": 63488 00:15:08.356 } 00:15:08.356 ] 00:15:08.356 }' 00:15:08.356 22:22:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:08.356 22:22:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.923 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:08.923 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:09.183 [2024-07-12 22:22:19.262868] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:09.183 [2024-07-12 22:22:19.262922] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:09.183 [2024-07-12 22:22:19.262946] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xec9c70 00:15:09.183 [2024-07-12 22:22:19.262959] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:09.183 [2024-07-12 22:22:19.263314] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:09.183 [2024-07-12 22:22:19.263331] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:09.183 [2024-07-12 22:22:19.263398] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:09.183 [2024-07-12 22:22:19.263417] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:09.183 pt2 00:15:09.183 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:09.183 [2024-07-12 22:22:19.507521] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:09.443 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:15:09.443 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:09.443 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:09.443 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:09.443 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:09.443 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:09.443 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:09.443 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:09.443 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:09.443 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:09.443 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:09.443 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:09.705 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:09.705 "name": "raid_bdev1", 00:15:09.705 "uuid": "af7dbacd-ea36-4388-9b62-b22ee4276783", 00:15:09.705 "strip_size_kb": 64, 00:15:09.705 "state": "configuring", 00:15:09.705 "raid_level": "raid0", 00:15:09.705 "superblock": true, 00:15:09.705 "num_base_bdevs": 3, 00:15:09.705 "num_base_bdevs_discovered": 1, 00:15:09.705 "num_base_bdevs_operational": 3, 00:15:09.705 "base_bdevs_list": [ 00:15:09.705 { 00:15:09.705 "name": "pt1", 00:15:09.705 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:09.705 "is_configured": true, 00:15:09.705 "data_offset": 2048, 00:15:09.705 "data_size": 63488 00:15:09.705 }, 00:15:09.705 { 00:15:09.705 "name": null, 00:15:09.705 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:09.705 "is_configured": false, 00:15:09.705 
"data_offset": 2048, 00:15:09.705 "data_size": 63488 00:15:09.705 }, 00:15:09.705 { 00:15:09.705 "name": null, 00:15:09.705 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:09.705 "is_configured": false, 00:15:09.705 "data_offset": 2048, 00:15:09.705 "data_size": 63488 00:15:09.705 } 00:15:09.705 ] 00:15:09.705 }' 00:15:09.705 22:22:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:09.705 22:22:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:10.361 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:10.361 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:10.361 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:10.361 [2024-07-12 22:22:20.582376] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:10.361 [2024-07-12 22:22:20.582433] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:10.361 [2024-07-12 22:22:20.582454] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x106afa0 00:15:10.361 [2024-07-12 22:22:20.582467] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:10.361 [2024-07-12 22:22:20.582822] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:10.361 [2024-07-12 22:22:20.582841] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:10.361 [2024-07-12 22:22:20.582910] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:10.361 [2024-07-12 22:22:20.582936] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:10.361 pt2 00:15:10.361 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:10.361 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:10.361 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:10.621 [2024-07-12 22:22:20.762850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:10.621 [2024-07-12 22:22:20.762887] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:10.621 [2024-07-12 22:22:20.762905] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x106bb30 00:15:10.621 [2024-07-12 22:22:20.762917] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:10.621 [2024-07-12 22:22:20.763224] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:10.621 [2024-07-12 22:22:20.763244] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:10.621 [2024-07-12 22:22:20.763298] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:10.621 [2024-07-12 22:22:20.763316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:10.621 [2024-07-12 22:22:20.763422] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x106cc00 00:15:10.621 [2024-07-12 22:22:20.763432] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:10.621 [2024-07-12 22:22:20.763594] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10759b0 00:15:10.621 [2024-07-12 22:22:20.763714] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x106cc00 00:15:10.621 [2024-07-12 22:22:20.763724] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x106cc00 00:15:10.621 [2024-07-12 22:22:20.763823] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:10.621 pt3 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.621 22:22:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:10.881 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.881 "name": "raid_bdev1", 00:15:10.881 "uuid": "af7dbacd-ea36-4388-9b62-b22ee4276783", 00:15:10.881 "strip_size_kb": 64, 00:15:10.881 "state": "online", 00:15:10.881 "raid_level": "raid0", 00:15:10.881 "superblock": true, 00:15:10.881 "num_base_bdevs": 3, 00:15:10.881 "num_base_bdevs_discovered": 3, 00:15:10.881 "num_base_bdevs_operational": 3, 00:15:10.881 "base_bdevs_list": [ 00:15:10.881 { 00:15:10.881 "name": "pt1", 00:15:10.881 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:10.881 "is_configured": true, 00:15:10.881 "data_offset": 2048, 00:15:10.881 "data_size": 63488 00:15:10.881 }, 00:15:10.881 { 00:15:10.881 "name": "pt2", 00:15:10.881 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:10.881 "is_configured": true, 00:15:10.881 "data_offset": 2048, 00:15:10.881 "data_size": 63488 00:15:10.881 }, 00:15:10.881 { 00:15:10.881 "name": "pt3", 00:15:10.881 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:10.881 "is_configured": true, 00:15:10.881 "data_offset": 2048, 00:15:10.881 "data_size": 63488 00:15:10.881 } 00:15:10.881 ] 00:15:10.881 }' 00:15:10.881 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.881 22:22:21 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.449 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:11.449 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:11.449 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:11.449 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:11.449 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:11.449 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:11.449 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:11.449 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:11.709 [2024-07-12 22:22:21.777825] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:11.709 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:11.709 "name": "raid_bdev1", 00:15:11.709 "aliases": [ 00:15:11.709 "af7dbacd-ea36-4388-9b62-b22ee4276783" 00:15:11.709 ], 00:15:11.709 "product_name": "Raid Volume", 00:15:11.709 "block_size": 512, 00:15:11.709 "num_blocks": 190464, 00:15:11.709 "uuid": "af7dbacd-ea36-4388-9b62-b22ee4276783", 00:15:11.709 "assigned_rate_limits": { 00:15:11.709 "rw_ios_per_sec": 0, 00:15:11.709 "rw_mbytes_per_sec": 0, 00:15:11.709 "r_mbytes_per_sec": 0, 00:15:11.709 "w_mbytes_per_sec": 0 00:15:11.709 }, 00:15:11.709 "claimed": false, 00:15:11.709 "zoned": false, 00:15:11.709 "supported_io_types": { 00:15:11.709 "read": true, 00:15:11.709 "write": true, 00:15:11.709 "unmap": true, 00:15:11.709 "flush": true, 00:15:11.709 "reset": true, 00:15:11.709 "nvme_admin": false, 00:15:11.709 "nvme_io": false, 00:15:11.709 "nvme_io_md": false, 00:15:11.709 "write_zeroes": true, 00:15:11.709 "zcopy": false, 00:15:11.709 "get_zone_info": false, 00:15:11.709 "zone_management": false, 00:15:11.709 "zone_append": false, 00:15:11.709 "compare": false, 00:15:11.709 "compare_and_write": false, 00:15:11.709 "abort": false, 00:15:11.709 "seek_hole": false, 00:15:11.709 "seek_data": false, 00:15:11.709 "copy": false, 00:15:11.709 "nvme_iov_md": false 00:15:11.709 }, 00:15:11.709 "memory_domains": [ 00:15:11.709 { 00:15:11.709 "dma_device_id": "system", 00:15:11.709 "dma_device_type": 1 00:15:11.709 }, 00:15:11.709 { 00:15:11.709 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.709 "dma_device_type": 2 00:15:11.709 }, 00:15:11.709 { 00:15:11.709 "dma_device_id": "system", 00:15:11.709 "dma_device_type": 1 00:15:11.709 }, 00:15:11.709 { 00:15:11.709 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.709 "dma_device_type": 2 00:15:11.709 }, 00:15:11.709 { 00:15:11.709 "dma_device_id": "system", 00:15:11.709 "dma_device_type": 1 00:15:11.709 }, 00:15:11.709 { 00:15:11.709 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.709 "dma_device_type": 2 00:15:11.709 } 00:15:11.709 ], 00:15:11.709 "driver_specific": { 00:15:11.709 "raid": { 00:15:11.709 "uuid": "af7dbacd-ea36-4388-9b62-b22ee4276783", 00:15:11.709 "strip_size_kb": 64, 00:15:11.709 "state": "online", 00:15:11.709 "raid_level": "raid0", 00:15:11.709 "superblock": true, 00:15:11.709 "num_base_bdevs": 3, 00:15:11.709 "num_base_bdevs_discovered": 3, 
00:15:11.709 "num_base_bdevs_operational": 3, 00:15:11.709 "base_bdevs_list": [ 00:15:11.709 { 00:15:11.709 "name": "pt1", 00:15:11.709 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:11.709 "is_configured": true, 00:15:11.709 "data_offset": 2048, 00:15:11.709 "data_size": 63488 00:15:11.709 }, 00:15:11.709 { 00:15:11.709 "name": "pt2", 00:15:11.709 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:11.709 "is_configured": true, 00:15:11.709 "data_offset": 2048, 00:15:11.709 "data_size": 63488 00:15:11.709 }, 00:15:11.709 { 00:15:11.709 "name": "pt3", 00:15:11.709 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:11.709 "is_configured": true, 00:15:11.709 "data_offset": 2048, 00:15:11.709 "data_size": 63488 00:15:11.709 } 00:15:11.709 ] 00:15:11.709 } 00:15:11.709 } 00:15:11.709 }' 00:15:11.709 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:11.709 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:11.709 pt2 00:15:11.709 pt3' 00:15:11.709 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:11.709 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:11.709 22:22:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:11.969 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:11.969 "name": "pt1", 00:15:11.969 "aliases": [ 00:15:11.969 "00000000-0000-0000-0000-000000000001" 00:15:11.969 ], 00:15:11.969 "product_name": "passthru", 00:15:11.969 "block_size": 512, 00:15:11.969 "num_blocks": 65536, 00:15:11.969 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:11.969 "assigned_rate_limits": { 00:15:11.969 "rw_ios_per_sec": 0, 00:15:11.969 "rw_mbytes_per_sec": 0, 00:15:11.969 "r_mbytes_per_sec": 0, 00:15:11.969 "w_mbytes_per_sec": 0 00:15:11.969 }, 00:15:11.969 "claimed": true, 00:15:11.969 "claim_type": "exclusive_write", 00:15:11.969 "zoned": false, 00:15:11.969 "supported_io_types": { 00:15:11.969 "read": true, 00:15:11.969 "write": true, 00:15:11.969 "unmap": true, 00:15:11.969 "flush": true, 00:15:11.969 "reset": true, 00:15:11.969 "nvme_admin": false, 00:15:11.969 "nvme_io": false, 00:15:11.969 "nvme_io_md": false, 00:15:11.969 "write_zeroes": true, 00:15:11.969 "zcopy": true, 00:15:11.969 "get_zone_info": false, 00:15:11.969 "zone_management": false, 00:15:11.969 "zone_append": false, 00:15:11.969 "compare": false, 00:15:11.969 "compare_and_write": false, 00:15:11.969 "abort": true, 00:15:11.969 "seek_hole": false, 00:15:11.969 "seek_data": false, 00:15:11.969 "copy": true, 00:15:11.969 "nvme_iov_md": false 00:15:11.969 }, 00:15:11.969 "memory_domains": [ 00:15:11.969 { 00:15:11.969 "dma_device_id": "system", 00:15:11.969 "dma_device_type": 1 00:15:11.969 }, 00:15:11.969 { 00:15:11.969 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.969 "dma_device_type": 2 00:15:11.969 } 00:15:11.969 ], 00:15:11.969 "driver_specific": { 00:15:11.969 "passthru": { 00:15:11.969 "name": "pt1", 00:15:11.969 "base_bdev_name": "malloc1" 00:15:11.969 } 00:15:11.969 } 00:15:11.969 }' 00:15:11.969 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:11.969 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:15:11.969 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:11.969 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:11.969 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:11.969 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:11.969 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.236 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.236 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:12.236 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.236 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.236 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:12.236 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:12.236 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:12.236 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:12.236 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:12.236 "name": "pt2", 00:15:12.236 "aliases": [ 00:15:12.236 "00000000-0000-0000-0000-000000000002" 00:15:12.236 ], 00:15:12.236 "product_name": "passthru", 00:15:12.236 "block_size": 512, 00:15:12.236 "num_blocks": 65536, 00:15:12.236 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:12.236 "assigned_rate_limits": { 00:15:12.236 "rw_ios_per_sec": 0, 00:15:12.236 "rw_mbytes_per_sec": 0, 00:15:12.236 "r_mbytes_per_sec": 0, 00:15:12.236 "w_mbytes_per_sec": 0 00:15:12.236 }, 00:15:12.236 "claimed": true, 00:15:12.236 "claim_type": "exclusive_write", 00:15:12.236 "zoned": false, 00:15:12.236 "supported_io_types": { 00:15:12.236 "read": true, 00:15:12.236 "write": true, 00:15:12.236 "unmap": true, 00:15:12.236 "flush": true, 00:15:12.236 "reset": true, 00:15:12.236 "nvme_admin": false, 00:15:12.236 "nvme_io": false, 00:15:12.236 "nvme_io_md": false, 00:15:12.236 "write_zeroes": true, 00:15:12.236 "zcopy": true, 00:15:12.236 "get_zone_info": false, 00:15:12.236 "zone_management": false, 00:15:12.236 "zone_append": false, 00:15:12.236 "compare": false, 00:15:12.236 "compare_and_write": false, 00:15:12.236 "abort": true, 00:15:12.236 "seek_hole": false, 00:15:12.236 "seek_data": false, 00:15:12.236 "copy": true, 00:15:12.236 "nvme_iov_md": false 00:15:12.236 }, 00:15:12.236 "memory_domains": [ 00:15:12.236 { 00:15:12.236 "dma_device_id": "system", 00:15:12.236 "dma_device_type": 1 00:15:12.236 }, 00:15:12.236 { 00:15:12.236 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.236 "dma_device_type": 2 00:15:12.236 } 00:15:12.236 ], 00:15:12.236 "driver_specific": { 00:15:12.236 "passthru": { 00:15:12.236 "name": "pt2", 00:15:12.236 "base_bdev_name": "malloc2" 00:15:12.236 } 00:15:12.236 } 00:15:12.236 }' 00:15:12.236 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.494 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.495 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:12.495 22:22:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.495 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.495 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:12.495 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.495 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.495 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:12.495 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.753 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.753 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:12.753 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:12.753 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:12.753 22:22:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:12.753 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:12.753 "name": "pt3", 00:15:12.753 "aliases": [ 00:15:12.753 "00000000-0000-0000-0000-000000000003" 00:15:12.753 ], 00:15:12.753 "product_name": "passthru", 00:15:12.753 "block_size": 512, 00:15:12.753 "num_blocks": 65536, 00:15:12.753 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:12.753 "assigned_rate_limits": { 00:15:12.753 "rw_ios_per_sec": 0, 00:15:12.753 "rw_mbytes_per_sec": 0, 00:15:12.753 "r_mbytes_per_sec": 0, 00:15:12.753 "w_mbytes_per_sec": 0 00:15:12.753 }, 00:15:12.753 "claimed": true, 00:15:12.753 "claim_type": "exclusive_write", 00:15:12.753 "zoned": false, 00:15:12.753 "supported_io_types": { 00:15:12.753 "read": true, 00:15:12.753 "write": true, 00:15:12.753 "unmap": true, 00:15:12.753 "flush": true, 00:15:12.753 "reset": true, 00:15:12.753 "nvme_admin": false, 00:15:12.753 "nvme_io": false, 00:15:12.753 "nvme_io_md": false, 00:15:12.753 "write_zeroes": true, 00:15:12.753 "zcopy": true, 00:15:12.753 "get_zone_info": false, 00:15:12.753 "zone_management": false, 00:15:12.753 "zone_append": false, 00:15:12.753 "compare": false, 00:15:12.753 "compare_and_write": false, 00:15:12.753 "abort": true, 00:15:12.753 "seek_hole": false, 00:15:12.753 "seek_data": false, 00:15:12.753 "copy": true, 00:15:12.753 "nvme_iov_md": false 00:15:12.753 }, 00:15:12.753 "memory_domains": [ 00:15:12.753 { 00:15:12.753 "dma_device_id": "system", 00:15:12.753 "dma_device_type": 1 00:15:12.753 }, 00:15:12.753 { 00:15:12.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.753 "dma_device_type": 2 00:15:12.753 } 00:15:12.753 ], 00:15:12.753 "driver_specific": { 00:15:12.753 "passthru": { 00:15:12.753 "name": "pt3", 00:15:12.753 "base_bdev_name": "malloc3" 00:15:12.753 } 00:15:12.753 } 00:15:12.753 }' 00:15:12.753 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:13.012 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:13.012 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:13.012 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:13.012 22:22:23 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:13.012 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:13.012 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:13.012 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:13.012 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:13.012 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:13.271 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:13.271 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:13.271 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:13.271 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:13.530 [2024-07-12 22:22:23.638966] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' af7dbacd-ea36-4388-9b62-b22ee4276783 '!=' af7dbacd-ea36-4388-9b62-b22ee4276783 ']' 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3450556 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3450556 ']' 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3450556 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3450556 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3450556' 00:15:13.530 killing process with pid 3450556 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3450556 00:15:13.530 [2024-07-12 22:22:23.700412] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:13.530 [2024-07-12 22:22:23.700467] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:13.530 [2024-07-12 22:22:23.700528] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:13.530 [2024-07-12 22:22:23.700541] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x106cc00 name raid_bdev1, state offline 00:15:13.530 22:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3450556 00:15:13.530 [2024-07-12 22:22:23.729350] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:13.789 22:22:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:13.789 00:15:13.789 real 0m13.091s 00:15:13.789 user 0m23.515s 00:15:13.789 sys 0m2.418s 00:15:13.789 22:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:13.789 22:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.789 ************************************ 00:15:13.789 END TEST raid_superblock_test 00:15:13.789 ************************************ 00:15:13.789 22:22:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:13.789 22:22:23 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:15:13.789 22:22:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:13.789 22:22:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:13.789 22:22:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:13.789 ************************************ 00:15:13.789 START TEST raid_read_error_test 00:15:13.789 ************************************ 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:13.789 22:22:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.tyuWWG5bQf 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3452604 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3452604 /var/tmp/spdk-raid.sock 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3452604 ']' 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:13.789 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:13.789 22:22:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.789 [2024-07-12 22:22:24.106155] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
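The bdevperf process started above runs in wait-for-RPC mode and performs no I/O until the harness later calls perform_tests. A minimal hand-run sketch of those two steps, assuming the workspace path, RPC socket, and mktemp'd log file shown in this trace, would be:

    # start bdevperf idle (-z), targeting raid_bdev1, logging to the file from the trace
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 \
        -o 128k -q 1 -z -f -L bdev_raid > /raidtest/tmp.tyuWWG5bQf 2>&1 &
    # once raid_bdev1 has been assembled, kick off the actual 60s randrw run
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
        -s /var/tmp/spdk-raid.sock perform_tests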
00:15:13.789 [2024-07-12 22:22:24.106221] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3452604 ] 00:15:14.048 [2024-07-12 22:22:24.224488] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:14.048 [2024-07-12 22:22:24.330997] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:14.306 [2024-07-12 22:22:24.399552] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:14.306 [2024-07-12 22:22:24.399591] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:14.875 22:22:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:14.875 22:22:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:14.875 22:22:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:14.875 22:22:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:15.134 BaseBdev1_malloc 00:15:15.134 22:22:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:15.394 true 00:15:15.394 22:22:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:15.652 [2024-07-12 22:22:25.738139] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:15.652 [2024-07-12 22:22:25.738185] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:15.652 [2024-07-12 22:22:25.738206] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b950d0 00:15:15.652 [2024-07-12 22:22:25.738219] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:15.652 [2024-07-12 22:22:25.740114] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:15.652 [2024-07-12 22:22:25.740143] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:15.652 BaseBdev1 00:15:15.652 22:22:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:15.652 22:22:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:15.910 BaseBdev2_malloc 00:15:15.910 22:22:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:16.168 true 00:15:16.168 22:22:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:16.168 [2024-07-12 22:22:26.477942] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:16.168 [2024-07-12 22:22:26.477987] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:16.168 [2024-07-12 22:22:26.478008] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b99910 00:15:16.168 [2024-07-12 22:22:26.478021] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:16.168 [2024-07-12 22:22:26.479567] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:16.168 [2024-07-12 22:22:26.479596] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:16.168 BaseBdev2 00:15:16.427 22:22:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:16.427 22:22:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:16.427 BaseBdev3_malloc 00:15:16.427 22:22:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:16.686 true 00:15:16.686 22:22:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:16.945 [2024-07-12 22:22:27.216728] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:16.945 [2024-07-12 22:22:27.216775] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:16.945 [2024-07-12 22:22:27.216797] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b9bbd0 00:15:16.945 [2024-07-12 22:22:27.216810] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:16.945 [2024-07-12 22:22:27.218425] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:16.945 [2024-07-12 22:22:27.218456] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:16.945 BaseBdev3 00:15:16.945 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:17.205 [2024-07-12 22:22:27.457397] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:17.205 [2024-07-12 22:22:27.458768] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:17.205 [2024-07-12 22:22:27.458839] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:17.205 [2024-07-12 22:22:27.459065] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b9d280 00:15:17.205 [2024-07-12 22:22:27.459076] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:17.205 [2024-07-12 22:22:27.459279] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b9ce20 00:15:17.205 [2024-07-12 22:22:27.459429] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b9d280 00:15:17.205 [2024-07-12 22:22:27.459439] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b9d280 00:15:17.205 [2024-07-12 22:22:27.459544] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:17.205 
22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:17.205 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:17.205 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:17.205 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:17.205 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.205 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:17.205 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.205 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.205 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.205 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.205 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.205 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:17.464 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.464 "name": "raid_bdev1", 00:15:17.464 "uuid": "fb11158b-3278-48cb-be17-1943b69b2373", 00:15:17.464 "strip_size_kb": 64, 00:15:17.464 "state": "online", 00:15:17.464 "raid_level": "raid0", 00:15:17.464 "superblock": true, 00:15:17.464 "num_base_bdevs": 3, 00:15:17.464 "num_base_bdevs_discovered": 3, 00:15:17.464 "num_base_bdevs_operational": 3, 00:15:17.464 "base_bdevs_list": [ 00:15:17.464 { 00:15:17.464 "name": "BaseBdev1", 00:15:17.464 "uuid": "38fa7e01-c3f1-52cf-aa24-a5f20002557d", 00:15:17.464 "is_configured": true, 00:15:17.464 "data_offset": 2048, 00:15:17.464 "data_size": 63488 00:15:17.464 }, 00:15:17.464 { 00:15:17.464 "name": "BaseBdev2", 00:15:17.464 "uuid": "de72ccd4-3574-573f-9ba0-5b62a3927d53", 00:15:17.464 "is_configured": true, 00:15:17.464 "data_offset": 2048, 00:15:17.464 "data_size": 63488 00:15:17.464 }, 00:15:17.464 { 00:15:17.464 "name": "BaseBdev3", 00:15:17.464 "uuid": "0f2a75b8-94c0-5d26-bbe5-ba6d81299ab5", 00:15:17.464 "is_configured": true, 00:15:17.464 "data_offset": 2048, 00:15:17.464 "data_size": 63488 00:15:17.464 } 00:15:17.464 ] 00:15:17.464 }' 00:15:17.464 22:22:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.464 22:22:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:18.030 22:22:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:18.030 22:22:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:18.291 [2024-07-12 22:22:28.376106] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19eb5b0 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:19.229 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.488 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.488 "name": "raid_bdev1", 00:15:19.488 "uuid": "fb11158b-3278-48cb-be17-1943b69b2373", 00:15:19.488 "strip_size_kb": 64, 00:15:19.488 "state": "online", 00:15:19.488 "raid_level": "raid0", 00:15:19.488 "superblock": true, 00:15:19.488 "num_base_bdevs": 3, 00:15:19.488 "num_base_bdevs_discovered": 3, 00:15:19.488 "num_base_bdevs_operational": 3, 00:15:19.488 "base_bdevs_list": [ 00:15:19.488 { 00:15:19.488 "name": "BaseBdev1", 00:15:19.488 "uuid": "38fa7e01-c3f1-52cf-aa24-a5f20002557d", 00:15:19.488 "is_configured": true, 00:15:19.488 "data_offset": 2048, 00:15:19.488 "data_size": 63488 00:15:19.488 }, 00:15:19.488 { 00:15:19.488 "name": "BaseBdev2", 00:15:19.488 "uuid": "de72ccd4-3574-573f-9ba0-5b62a3927d53", 00:15:19.488 "is_configured": true, 00:15:19.488 "data_offset": 2048, 00:15:19.488 "data_size": 63488 00:15:19.488 }, 00:15:19.488 { 00:15:19.488 "name": "BaseBdev3", 00:15:19.488 "uuid": "0f2a75b8-94c0-5d26-bbe5-ba6d81299ab5", 00:15:19.488 "is_configured": true, 00:15:19.488 "data_offset": 2048, 00:15:19.488 "data_size": 63488 00:15:19.488 } 00:15:19.488 ] 00:15:19.488 }' 00:15:19.488 22:22:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.488 22:22:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:20.057 22:22:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:20.317 [2024-07-12 22:22:30.404206] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:20.317 [2024-07-12 22:22:30.404244] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:20.317 [2024-07-12 
22:22:30.407372] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:20.317 [2024-07-12 22:22:30.407411] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:20.317 [2024-07-12 22:22:30.407447] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:20.317 [2024-07-12 22:22:30.407458] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b9d280 name raid_bdev1, state offline 00:15:20.317 0 00:15:20.317 22:22:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3452604 00:15:20.317 22:22:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3452604 ']' 00:15:20.317 22:22:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3452604 00:15:20.317 22:22:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:20.317 22:22:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:20.317 22:22:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3452604 00:15:20.317 22:22:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:20.317 22:22:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:20.317 22:22:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3452604' 00:15:20.317 killing process with pid 3452604 00:15:20.317 22:22:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3452604 00:15:20.317 [2024-07-12 22:22:30.474920] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:20.317 22:22:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3452604 00:15:20.317 [2024-07-12 22:22:30.496606] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:20.576 22:22:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.tyuWWG5bQf 00:15:20.576 22:22:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:20.576 22:22:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:20.576 22:22:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:15:20.576 22:22:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:20.576 22:22:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:20.576 22:22:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:20.576 22:22:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:15:20.576 00:15:20.576 real 0m6.702s 00:15:20.576 user 0m10.556s 00:15:20.576 sys 0m1.176s 00:15:20.577 22:22:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:20.577 22:22:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:20.577 ************************************ 00:15:20.577 END TEST raid_read_error_test 00:15:20.577 ************************************ 00:15:20.577 22:22:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:20.577 22:22:30 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:15:20.577 22:22:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 
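The write-error test starting here drives the same RPC sequence as the read-error test above; only the injected I/O type changes. A condensed sketch of that sequence, with RPC standing in for the rpc.py invocation used throughout this trace, is:

    RPC='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'
    for i in 1 2 3; do
        $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc          # 32 MiB backing malloc bdev, 512B blocks
        $RPC bdev_error_create BaseBdev${i}_malloc                     # wraps it as EE_BaseBdev${i}_malloc
        $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
    done
    $RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure     # 'read failure' in the read test
    $RPC bdev_raid_delete raid_bdev1                                   # teardown after bdevperf completes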
00:15:20.577 22:22:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:20.577 22:22:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:20.577 ************************************ 00:15:20.577 START TEST raid_write_error_test 00:15:20.577 ************************************ 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.xMnrcCwywY 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3453582 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3453582 /var/tmp/spdk-raid.sock 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 
-r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3453582 ']' 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:20.577 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:20.577 22:22:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:20.577 [2024-07-12 22:22:30.900715] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:15:20.577 [2024-07-12 22:22:30.900790] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3453582 ] 00:15:20.836 [2024-07-12 22:22:31.030642] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:20.836 [2024-07-12 22:22:31.134172] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:21.096 [2024-07-12 22:22:31.192816] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:21.096 [2024-07-12 22:22:31.192845] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:21.665 22:22:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:21.665 22:22:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:21.665 22:22:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:21.665 22:22:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:21.924 BaseBdev1_malloc 00:15:21.924 22:22:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:22.183 true 00:15:22.183 22:22:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:22.442 [2024-07-12 22:22:32.559068] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:22.442 [2024-07-12 22:22:32.559113] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:22.442 [2024-07-12 22:22:32.559134] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xec10d0 00:15:22.442 [2024-07-12 22:22:32.559147] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:22.442 [2024-07-12 22:22:32.560964] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:22.442 [2024-07-12 22:22:32.560994] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:15:22.442 BaseBdev1 00:15:22.442 22:22:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:22.442 22:22:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:22.719 BaseBdev2_malloc 00:15:22.719 22:22:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:22.987 true 00:15:22.987 22:22:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:22.987 [2024-07-12 22:22:33.297714] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:22.987 [2024-07-12 22:22:33.297759] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:22.987 [2024-07-12 22:22:33.297778] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xec5910 00:15:22.987 [2024-07-12 22:22:33.297791] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:22.987 [2024-07-12 22:22:33.299189] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:22.987 [2024-07-12 22:22:33.299217] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:22.987 BaseBdev2 00:15:23.246 22:22:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:23.246 22:22:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:23.246 BaseBdev3_malloc 00:15:23.246 22:22:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:23.506 true 00:15:23.506 22:22:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:23.764 [2024-07-12 22:22:34.036271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:23.764 [2024-07-12 22:22:34.036318] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:23.764 [2024-07-12 22:22:34.036339] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xec7bd0 00:15:23.764 [2024-07-12 22:22:34.036352] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:23.764 [2024-07-12 22:22:34.037869] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:23.764 [2024-07-12 22:22:34.037897] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:23.764 BaseBdev3 00:15:23.764 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:24.022 [2024-07-12 22:22:34.284966] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:24.022 [2024-07-12 22:22:34.286248] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:24.022 [2024-07-12 22:22:34.286317] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:24.022 [2024-07-12 22:22:34.286529] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xec9280 00:15:24.022 [2024-07-12 22:22:34.286541] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:24.022 [2024-07-12 22:22:34.286740] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xec8e20 00:15:24.022 [2024-07-12 22:22:34.286888] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xec9280 00:15:24.022 [2024-07-12 22:22:34.286898] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xec9280 00:15:24.022 [2024-07-12 22:22:34.287009] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:24.022 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:24.022 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:24.022 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:24.022 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:24.023 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.023 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:24.023 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.023 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.023 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.023 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.023 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.023 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:24.281 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.281 "name": "raid_bdev1", 00:15:24.281 "uuid": "32e4f82e-0d57-4f5d-bc4d-8a6bdf07adda", 00:15:24.281 "strip_size_kb": 64, 00:15:24.281 "state": "online", 00:15:24.281 "raid_level": "raid0", 00:15:24.281 "superblock": true, 00:15:24.281 "num_base_bdevs": 3, 00:15:24.281 "num_base_bdevs_discovered": 3, 00:15:24.281 "num_base_bdevs_operational": 3, 00:15:24.281 "base_bdevs_list": [ 00:15:24.281 { 00:15:24.281 "name": "BaseBdev1", 00:15:24.281 "uuid": "a9b418e6-7096-51f6-b55e-2f666b5e1f02", 00:15:24.281 "is_configured": true, 00:15:24.281 "data_offset": 2048, 00:15:24.281 "data_size": 63488 00:15:24.281 }, 00:15:24.281 { 00:15:24.281 "name": "BaseBdev2", 00:15:24.281 "uuid": "fcb5738a-7c6e-55eb-aafe-a2d1993cf014", 00:15:24.281 "is_configured": true, 00:15:24.281 "data_offset": 2048, 00:15:24.281 "data_size": 63488 00:15:24.281 }, 00:15:24.281 { 00:15:24.281 "name": "BaseBdev3", 00:15:24.281 "uuid": 
"abd93cc3-8b52-53a8-8c9e-8d3e962c4fdf", 00:15:24.281 "is_configured": true, 00:15:24.281 "data_offset": 2048, 00:15:24.281 "data_size": 63488 00:15:24.281 } 00:15:24.281 ] 00:15:24.281 }' 00:15:24.281 22:22:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.281 22:22:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:24.849 22:22:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:24.849 22:22:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:25.109 [2024-07-12 22:22:35.239764] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd175b0 00:15:26.047 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:26.306 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.565 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:26.565 "name": "raid_bdev1", 00:15:26.565 "uuid": "32e4f82e-0d57-4f5d-bc4d-8a6bdf07adda", 00:15:26.565 "strip_size_kb": 64, 00:15:26.565 "state": "online", 00:15:26.565 "raid_level": "raid0", 00:15:26.565 "superblock": true, 00:15:26.565 "num_base_bdevs": 3, 00:15:26.565 "num_base_bdevs_discovered": 3, 00:15:26.565 "num_base_bdevs_operational": 3, 00:15:26.565 "base_bdevs_list": [ 00:15:26.565 { 00:15:26.565 "name": "BaseBdev1", 00:15:26.565 "uuid": "a9b418e6-7096-51f6-b55e-2f666b5e1f02", 00:15:26.565 "is_configured": true, 00:15:26.565 "data_offset": 2048, 00:15:26.565 "data_size": 63488 00:15:26.565 }, 00:15:26.565 { 
00:15:26.565 "name": "BaseBdev2", 00:15:26.565 "uuid": "fcb5738a-7c6e-55eb-aafe-a2d1993cf014", 00:15:26.565 "is_configured": true, 00:15:26.565 "data_offset": 2048, 00:15:26.565 "data_size": 63488 00:15:26.565 }, 00:15:26.565 { 00:15:26.565 "name": "BaseBdev3", 00:15:26.565 "uuid": "abd93cc3-8b52-53a8-8c9e-8d3e962c4fdf", 00:15:26.565 "is_configured": true, 00:15:26.565 "data_offset": 2048, 00:15:26.565 "data_size": 63488 00:15:26.565 } 00:15:26.565 ] 00:15:26.565 }' 00:15:26.565 22:22:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:26.565 22:22:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:27.131 22:22:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:27.131 [2024-07-12 22:22:37.424684] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:27.131 [2024-07-12 22:22:37.424727] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:27.131 [2024-07-12 22:22:37.427943] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:27.131 [2024-07-12 22:22:37.427983] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:27.131 [2024-07-12 22:22:37.428021] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:27.131 [2024-07-12 22:22:37.428033] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xec9280 name raid_bdev1, state offline 00:15:27.131 0 00:15:27.131 22:22:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3453582 00:15:27.131 22:22:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3453582 ']' 00:15:27.131 22:22:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3453582 00:15:27.131 22:22:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:27.132 22:22:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:27.132 22:22:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3453582 00:15:27.389 22:22:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:27.389 22:22:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:27.389 22:22:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3453582' 00:15:27.389 killing process with pid 3453582 00:15:27.389 22:22:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3453582 00:15:27.389 [2024-07-12 22:22:37.488981] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:27.389 22:22:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3453582 00:15:27.389 [2024-07-12 22:22:37.510046] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:27.646 22:22:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.xMnrcCwywY 00:15:27.646 22:22:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:27.646 22:22:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:27.646 22:22:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 
-- # fail_per_s=0.46 00:15:27.646 22:22:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:27.646 22:22:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:27.646 22:22:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:27.646 22:22:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:15:27.646 00:15:27.646 real 0m6.924s 00:15:27.646 user 0m10.947s 00:15:27.646 sys 0m1.180s 00:15:27.646 22:22:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:27.646 22:22:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:27.646 ************************************ 00:15:27.646 END TEST raid_write_error_test 00:15:27.646 ************************************ 00:15:27.646 22:22:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:27.646 22:22:37 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:27.646 22:22:37 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:15:27.646 22:22:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:27.646 22:22:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:27.646 22:22:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:27.646 ************************************ 00:15:27.646 START TEST raid_state_function_test 00:15:27.646 ************************************ 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:27.646 22:22:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3454568 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3454568' 00:15:27.646 Process raid pid: 3454568 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3454568 /var/tmp/spdk-raid.sock 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3454568 ']' 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:27.646 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:27.646 22:22:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:27.646 [2024-07-12 22:22:37.903991] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:15:27.646 [2024-07-12 22:22:37.904062] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:27.905 [2024-07-12 22:22:38.038527] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:27.905 [2024-07-12 22:22:38.140991] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:27.905 [2024-07-12 22:22:38.211131] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:27.905 [2024-07-12 22:22:38.211170] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:28.472 22:22:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:28.472 22:22:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:28.472 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:28.755 [2024-07-12 22:22:38.909850] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:28.755 [2024-07-12 22:22:38.909894] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:28.755 [2024-07-12 22:22:38.909905] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:28.755 [2024-07-12 22:22:38.909917] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:28.755 [2024-07-12 22:22:38.909932] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:28.755 [2024-07-12 22:22:38.909944] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:28.755 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:28.755 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:28.755 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:28.755 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:28.755 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:28.755 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:28.755 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:28.755 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:28.755 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:28.755 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:28.755 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.755 22:22:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:29.066 22:22:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:15:29.066 "name": "Existed_Raid", 00:15:29.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:29.066 "strip_size_kb": 64, 00:15:29.066 "state": "configuring", 00:15:29.066 "raid_level": "concat", 00:15:29.066 "superblock": false, 00:15:29.066 "num_base_bdevs": 3, 00:15:29.066 "num_base_bdevs_discovered": 0, 00:15:29.066 "num_base_bdevs_operational": 3, 00:15:29.066 "base_bdevs_list": [ 00:15:29.066 { 00:15:29.066 "name": "BaseBdev1", 00:15:29.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:29.066 "is_configured": false, 00:15:29.066 "data_offset": 0, 00:15:29.066 "data_size": 0 00:15:29.066 }, 00:15:29.066 { 00:15:29.066 "name": "BaseBdev2", 00:15:29.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:29.066 "is_configured": false, 00:15:29.066 "data_offset": 0, 00:15:29.066 "data_size": 0 00:15:29.066 }, 00:15:29.066 { 00:15:29.066 "name": "BaseBdev3", 00:15:29.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:29.067 "is_configured": false, 00:15:29.067 "data_offset": 0, 00:15:29.067 "data_size": 0 00:15:29.067 } 00:15:29.067 ] 00:15:29.067 }' 00:15:29.067 22:22:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:29.067 22:22:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:29.635 22:22:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:29.635 [2024-07-12 22:22:39.924397] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:29.635 [2024-07-12 22:22:39.924432] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb2ca80 name Existed_Raid, state configuring 00:15:29.635 22:22:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:29.894 [2024-07-12 22:22:40.169089] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:29.894 [2024-07-12 22:22:40.169129] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:29.894 [2024-07-12 22:22:40.169140] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:29.894 [2024-07-12 22:22:40.169151] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:29.894 [2024-07-12 22:22:40.169160] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:29.894 [2024-07-12 22:22:40.169172] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:29.894 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:30.153 [2024-07-12 22:22:40.355393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:30.153 BaseBdev1 00:15:30.153 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:30.153 22:22:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:30.153 22:22:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 
00:15:30.153 22:22:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:30.153 22:22:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:30.153 22:22:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:30.153 22:22:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:30.412 22:22:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:30.671 [ 00:15:30.671 { 00:15:30.671 "name": "BaseBdev1", 00:15:30.671 "aliases": [ 00:15:30.671 "923cf4b7-ab8c-4072-bd26-424f79e19087" 00:15:30.671 ], 00:15:30.671 "product_name": "Malloc disk", 00:15:30.671 "block_size": 512, 00:15:30.671 "num_blocks": 65536, 00:15:30.671 "uuid": "923cf4b7-ab8c-4072-bd26-424f79e19087", 00:15:30.671 "assigned_rate_limits": { 00:15:30.671 "rw_ios_per_sec": 0, 00:15:30.671 "rw_mbytes_per_sec": 0, 00:15:30.671 "r_mbytes_per_sec": 0, 00:15:30.671 "w_mbytes_per_sec": 0 00:15:30.671 }, 00:15:30.671 "claimed": true, 00:15:30.671 "claim_type": "exclusive_write", 00:15:30.671 "zoned": false, 00:15:30.671 "supported_io_types": { 00:15:30.671 "read": true, 00:15:30.671 "write": true, 00:15:30.671 "unmap": true, 00:15:30.672 "flush": true, 00:15:30.672 "reset": true, 00:15:30.672 "nvme_admin": false, 00:15:30.672 "nvme_io": false, 00:15:30.672 "nvme_io_md": false, 00:15:30.672 "write_zeroes": true, 00:15:30.672 "zcopy": true, 00:15:30.672 "get_zone_info": false, 00:15:30.672 "zone_management": false, 00:15:30.672 "zone_append": false, 00:15:30.672 "compare": false, 00:15:30.672 "compare_and_write": false, 00:15:30.672 "abort": true, 00:15:30.672 "seek_hole": false, 00:15:30.672 "seek_data": false, 00:15:30.672 "copy": true, 00:15:30.672 "nvme_iov_md": false 00:15:30.672 }, 00:15:30.672 "memory_domains": [ 00:15:30.672 { 00:15:30.672 "dma_device_id": "system", 00:15:30.672 "dma_device_type": 1 00:15:30.672 }, 00:15:30.672 { 00:15:30.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.672 "dma_device_type": 2 00:15:30.672 } 00:15:30.672 ], 00:15:30.672 "driver_specific": {} 00:15:30.672 } 00:15:30.672 ] 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:30.672 22:22:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.930 22:22:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.930 "name": "Existed_Raid", 00:15:30.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.930 "strip_size_kb": 64, 00:15:30.930 "state": "configuring", 00:15:30.930 "raid_level": "concat", 00:15:30.930 "superblock": false, 00:15:30.930 "num_base_bdevs": 3, 00:15:30.930 "num_base_bdevs_discovered": 1, 00:15:30.930 "num_base_bdevs_operational": 3, 00:15:30.930 "base_bdevs_list": [ 00:15:30.930 { 00:15:30.930 "name": "BaseBdev1", 00:15:30.930 "uuid": "923cf4b7-ab8c-4072-bd26-424f79e19087", 00:15:30.930 "is_configured": true, 00:15:30.930 "data_offset": 0, 00:15:30.930 "data_size": 65536 00:15:30.930 }, 00:15:30.930 { 00:15:30.930 "name": "BaseBdev2", 00:15:30.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.930 "is_configured": false, 00:15:30.930 "data_offset": 0, 00:15:30.930 "data_size": 0 00:15:30.930 }, 00:15:30.930 { 00:15:30.930 "name": "BaseBdev3", 00:15:30.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.930 "is_configured": false, 00:15:30.930 "data_offset": 0, 00:15:30.930 "data_size": 0 00:15:30.930 } 00:15:30.930 ] 00:15:30.930 }' 00:15:30.930 22:22:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.930 22:22:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:31.497 22:22:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:31.755 [2024-07-12 22:22:41.871422] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:31.755 [2024-07-12 22:22:41.871465] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb2c310 name Existed_Raid, state configuring 00:15:31.755 22:22:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:32.014 [2024-07-12 22:22:42.116111] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:32.014 [2024-07-12 22:22:42.117573] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:32.014 [2024-07-12 22:22:42.117608] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:32.014 [2024-07-12 22:22:42.117618] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:32.014 [2024-07-12 22:22:42.117630] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.014 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:32.274 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.274 "name": "Existed_Raid", 00:15:32.274 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:32.274 "strip_size_kb": 64, 00:15:32.274 "state": "configuring", 00:15:32.274 "raid_level": "concat", 00:15:32.274 "superblock": false, 00:15:32.274 "num_base_bdevs": 3, 00:15:32.274 "num_base_bdevs_discovered": 1, 00:15:32.274 "num_base_bdevs_operational": 3, 00:15:32.274 "base_bdevs_list": [ 00:15:32.274 { 00:15:32.274 "name": "BaseBdev1", 00:15:32.274 "uuid": "923cf4b7-ab8c-4072-bd26-424f79e19087", 00:15:32.274 "is_configured": true, 00:15:32.274 "data_offset": 0, 00:15:32.274 "data_size": 65536 00:15:32.274 }, 00:15:32.274 { 00:15:32.274 "name": "BaseBdev2", 00:15:32.274 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:32.274 "is_configured": false, 00:15:32.274 "data_offset": 0, 00:15:32.274 "data_size": 0 00:15:32.274 }, 00:15:32.274 { 00:15:32.274 "name": "BaseBdev3", 00:15:32.274 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:32.274 "is_configured": false, 00:15:32.274 "data_offset": 0, 00:15:32.274 "data_size": 0 00:15:32.274 } 00:15:32.274 ] 00:15:32.274 }' 00:15:32.274 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.274 22:22:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:32.842 22:22:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:33.100 [2024-07-12 22:22:43.218420] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:33.100 BaseBdev2 00:15:33.100 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:33.100 22:22:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:33.100 22:22:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:33.100 22:22:43 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:33.100 22:22:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:33.100 22:22:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:33.100 22:22:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:33.360 [ 00:15:33.360 { 00:15:33.360 "name": "BaseBdev2", 00:15:33.360 "aliases": [ 00:15:33.360 "0ec6c975-8ee6-4694-b8db-c9e69880e93e" 00:15:33.360 ], 00:15:33.360 "product_name": "Malloc disk", 00:15:33.360 "block_size": 512, 00:15:33.360 "num_blocks": 65536, 00:15:33.360 "uuid": "0ec6c975-8ee6-4694-b8db-c9e69880e93e", 00:15:33.360 "assigned_rate_limits": { 00:15:33.360 "rw_ios_per_sec": 0, 00:15:33.360 "rw_mbytes_per_sec": 0, 00:15:33.360 "r_mbytes_per_sec": 0, 00:15:33.360 "w_mbytes_per_sec": 0 00:15:33.360 }, 00:15:33.360 "claimed": true, 00:15:33.360 "claim_type": "exclusive_write", 00:15:33.360 "zoned": false, 00:15:33.360 "supported_io_types": { 00:15:33.360 "read": true, 00:15:33.360 "write": true, 00:15:33.360 "unmap": true, 00:15:33.360 "flush": true, 00:15:33.360 "reset": true, 00:15:33.360 "nvme_admin": false, 00:15:33.360 "nvme_io": false, 00:15:33.360 "nvme_io_md": false, 00:15:33.360 "write_zeroes": true, 00:15:33.360 "zcopy": true, 00:15:33.360 "get_zone_info": false, 00:15:33.360 "zone_management": false, 00:15:33.360 "zone_append": false, 00:15:33.360 "compare": false, 00:15:33.360 "compare_and_write": false, 00:15:33.360 "abort": true, 00:15:33.360 "seek_hole": false, 00:15:33.360 "seek_data": false, 00:15:33.360 "copy": true, 00:15:33.360 "nvme_iov_md": false 00:15:33.360 }, 00:15:33.360 "memory_domains": [ 00:15:33.360 { 00:15:33.360 "dma_device_id": "system", 00:15:33.360 "dma_device_type": 1 00:15:33.360 }, 00:15:33.360 { 00:15:33.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.360 "dma_device_type": 2 00:15:33.360 } 00:15:33.360 ], 00:15:33.360 "driver_specific": {} 00:15:33.360 } 00:15:33.360 ] 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:33.360 
22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.360 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:33.620 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:33.620 "name": "Existed_Raid", 00:15:33.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.620 "strip_size_kb": 64, 00:15:33.620 "state": "configuring", 00:15:33.620 "raid_level": "concat", 00:15:33.620 "superblock": false, 00:15:33.620 "num_base_bdevs": 3, 00:15:33.620 "num_base_bdevs_discovered": 2, 00:15:33.620 "num_base_bdevs_operational": 3, 00:15:33.620 "base_bdevs_list": [ 00:15:33.620 { 00:15:33.620 "name": "BaseBdev1", 00:15:33.620 "uuid": "923cf4b7-ab8c-4072-bd26-424f79e19087", 00:15:33.620 "is_configured": true, 00:15:33.620 "data_offset": 0, 00:15:33.620 "data_size": 65536 00:15:33.620 }, 00:15:33.620 { 00:15:33.620 "name": "BaseBdev2", 00:15:33.620 "uuid": "0ec6c975-8ee6-4694-b8db-c9e69880e93e", 00:15:33.620 "is_configured": true, 00:15:33.620 "data_offset": 0, 00:15:33.620 "data_size": 65536 00:15:33.620 }, 00:15:33.620 { 00:15:33.620 "name": "BaseBdev3", 00:15:33.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.620 "is_configured": false, 00:15:33.620 "data_offset": 0, 00:15:33.620 "data_size": 0 00:15:33.620 } 00:15:33.620 ] 00:15:33.620 }' 00:15:33.620 22:22:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:33.620 22:22:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:34.188 22:22:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:34.447 [2024-07-12 22:22:44.685777] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:34.447 [2024-07-12 22:22:44.685828] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb2d400 00:15:34.447 [2024-07-12 22:22:44.685838] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:34.447 [2024-07-12 22:22:44.686090] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb2cef0 00:15:34.447 [2024-07-12 22:22:44.686211] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb2d400 00:15:34.447 [2024-07-12 22:22:44.686221] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb2d400 00:15:34.447 [2024-07-12 22:22:44.686394] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:34.447 BaseBdev3 00:15:34.447 22:22:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:34.447 22:22:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:34.447 22:22:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:34.447 22:22:44 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:34.447 22:22:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:34.447 22:22:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:34.447 22:22:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:34.706 22:22:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:34.965 [ 00:15:34.965 { 00:15:34.965 "name": "BaseBdev3", 00:15:34.965 "aliases": [ 00:15:34.965 "00cfb204-62ac-4b95-84e1-bd7759820aeb" 00:15:34.965 ], 00:15:34.965 "product_name": "Malloc disk", 00:15:34.965 "block_size": 512, 00:15:34.965 "num_blocks": 65536, 00:15:34.965 "uuid": "00cfb204-62ac-4b95-84e1-bd7759820aeb", 00:15:34.965 "assigned_rate_limits": { 00:15:34.965 "rw_ios_per_sec": 0, 00:15:34.965 "rw_mbytes_per_sec": 0, 00:15:34.965 "r_mbytes_per_sec": 0, 00:15:34.965 "w_mbytes_per_sec": 0 00:15:34.965 }, 00:15:34.965 "claimed": true, 00:15:34.965 "claim_type": "exclusive_write", 00:15:34.965 "zoned": false, 00:15:34.965 "supported_io_types": { 00:15:34.965 "read": true, 00:15:34.965 "write": true, 00:15:34.965 "unmap": true, 00:15:34.965 "flush": true, 00:15:34.965 "reset": true, 00:15:34.965 "nvme_admin": false, 00:15:34.965 "nvme_io": false, 00:15:34.965 "nvme_io_md": false, 00:15:34.965 "write_zeroes": true, 00:15:34.965 "zcopy": true, 00:15:34.965 "get_zone_info": false, 00:15:34.965 "zone_management": false, 00:15:34.965 "zone_append": false, 00:15:34.965 "compare": false, 00:15:34.965 "compare_and_write": false, 00:15:34.965 "abort": true, 00:15:34.965 "seek_hole": false, 00:15:34.965 "seek_data": false, 00:15:34.965 "copy": true, 00:15:34.965 "nvme_iov_md": false 00:15:34.965 }, 00:15:34.965 "memory_domains": [ 00:15:34.965 { 00:15:34.965 "dma_device_id": "system", 00:15:34.965 "dma_device_type": 1 00:15:34.965 }, 00:15:34.965 { 00:15:34.965 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.965 "dma_device_type": 2 00:15:34.965 } 00:15:34.965 ], 00:15:34.966 "driver_specific": {} 00:15:34.966 } 00:15:34.966 ] 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.966 22:22:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.966 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:35.226 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:35.226 "name": "Existed_Raid", 00:15:35.226 "uuid": "d45aeaa9-54dd-4fda-9922-dc8a1166694b", 00:15:35.226 "strip_size_kb": 64, 00:15:35.226 "state": "online", 00:15:35.226 "raid_level": "concat", 00:15:35.226 "superblock": false, 00:15:35.226 "num_base_bdevs": 3, 00:15:35.226 "num_base_bdevs_discovered": 3, 00:15:35.226 "num_base_bdevs_operational": 3, 00:15:35.226 "base_bdevs_list": [ 00:15:35.226 { 00:15:35.226 "name": "BaseBdev1", 00:15:35.226 "uuid": "923cf4b7-ab8c-4072-bd26-424f79e19087", 00:15:35.226 "is_configured": true, 00:15:35.226 "data_offset": 0, 00:15:35.226 "data_size": 65536 00:15:35.226 }, 00:15:35.226 { 00:15:35.226 "name": "BaseBdev2", 00:15:35.226 "uuid": "0ec6c975-8ee6-4694-b8db-c9e69880e93e", 00:15:35.226 "is_configured": true, 00:15:35.226 "data_offset": 0, 00:15:35.226 "data_size": 65536 00:15:35.226 }, 00:15:35.226 { 00:15:35.226 "name": "BaseBdev3", 00:15:35.226 "uuid": "00cfb204-62ac-4b95-84e1-bd7759820aeb", 00:15:35.226 "is_configured": true, 00:15:35.226 "data_offset": 0, 00:15:35.226 "data_size": 65536 00:15:35.226 } 00:15:35.226 ] 00:15:35.226 }' 00:15:35.226 22:22:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:35.226 22:22:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:35.795 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:35.795 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:35.795 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:35.795 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:35.795 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:35.795 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:35.795 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:35.795 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:36.055 [2024-07-12 22:22:46.174189] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:36.055 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:36.055 "name": "Existed_Raid", 00:15:36.055 "aliases": [ 00:15:36.055 "d45aeaa9-54dd-4fda-9922-dc8a1166694b" 00:15:36.055 ], 00:15:36.055 "product_name": "Raid Volume", 00:15:36.055 "block_size": 512, 00:15:36.055 "num_blocks": 196608, 00:15:36.055 "uuid": "d45aeaa9-54dd-4fda-9922-dc8a1166694b", 
00:15:36.055 "assigned_rate_limits": { 00:15:36.055 "rw_ios_per_sec": 0, 00:15:36.055 "rw_mbytes_per_sec": 0, 00:15:36.055 "r_mbytes_per_sec": 0, 00:15:36.055 "w_mbytes_per_sec": 0 00:15:36.055 }, 00:15:36.055 "claimed": false, 00:15:36.055 "zoned": false, 00:15:36.055 "supported_io_types": { 00:15:36.055 "read": true, 00:15:36.055 "write": true, 00:15:36.055 "unmap": true, 00:15:36.055 "flush": true, 00:15:36.055 "reset": true, 00:15:36.055 "nvme_admin": false, 00:15:36.055 "nvme_io": false, 00:15:36.055 "nvme_io_md": false, 00:15:36.055 "write_zeroes": true, 00:15:36.055 "zcopy": false, 00:15:36.055 "get_zone_info": false, 00:15:36.055 "zone_management": false, 00:15:36.055 "zone_append": false, 00:15:36.055 "compare": false, 00:15:36.055 "compare_and_write": false, 00:15:36.055 "abort": false, 00:15:36.055 "seek_hole": false, 00:15:36.055 "seek_data": false, 00:15:36.055 "copy": false, 00:15:36.055 "nvme_iov_md": false 00:15:36.055 }, 00:15:36.055 "memory_domains": [ 00:15:36.055 { 00:15:36.055 "dma_device_id": "system", 00:15:36.055 "dma_device_type": 1 00:15:36.055 }, 00:15:36.055 { 00:15:36.055 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.055 "dma_device_type": 2 00:15:36.055 }, 00:15:36.055 { 00:15:36.055 "dma_device_id": "system", 00:15:36.055 "dma_device_type": 1 00:15:36.055 }, 00:15:36.055 { 00:15:36.055 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.055 "dma_device_type": 2 00:15:36.055 }, 00:15:36.055 { 00:15:36.055 "dma_device_id": "system", 00:15:36.055 "dma_device_type": 1 00:15:36.055 }, 00:15:36.055 { 00:15:36.055 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.055 "dma_device_type": 2 00:15:36.055 } 00:15:36.055 ], 00:15:36.055 "driver_specific": { 00:15:36.055 "raid": { 00:15:36.055 "uuid": "d45aeaa9-54dd-4fda-9922-dc8a1166694b", 00:15:36.055 "strip_size_kb": 64, 00:15:36.055 "state": "online", 00:15:36.055 "raid_level": "concat", 00:15:36.055 "superblock": false, 00:15:36.055 "num_base_bdevs": 3, 00:15:36.056 "num_base_bdevs_discovered": 3, 00:15:36.056 "num_base_bdevs_operational": 3, 00:15:36.056 "base_bdevs_list": [ 00:15:36.056 { 00:15:36.056 "name": "BaseBdev1", 00:15:36.056 "uuid": "923cf4b7-ab8c-4072-bd26-424f79e19087", 00:15:36.056 "is_configured": true, 00:15:36.056 "data_offset": 0, 00:15:36.056 "data_size": 65536 00:15:36.056 }, 00:15:36.056 { 00:15:36.056 "name": "BaseBdev2", 00:15:36.056 "uuid": "0ec6c975-8ee6-4694-b8db-c9e69880e93e", 00:15:36.056 "is_configured": true, 00:15:36.056 "data_offset": 0, 00:15:36.056 "data_size": 65536 00:15:36.056 }, 00:15:36.056 { 00:15:36.056 "name": "BaseBdev3", 00:15:36.056 "uuid": "00cfb204-62ac-4b95-84e1-bd7759820aeb", 00:15:36.056 "is_configured": true, 00:15:36.056 "data_offset": 0, 00:15:36.056 "data_size": 65536 00:15:36.056 } 00:15:36.056 ] 00:15:36.056 } 00:15:36.056 } 00:15:36.056 }' 00:15:36.056 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:36.056 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:36.056 BaseBdev2 00:15:36.056 BaseBdev3' 00:15:36.056 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:36.056 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:36.056 22:22:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:36.316 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:36.316 "name": "BaseBdev1", 00:15:36.316 "aliases": [ 00:15:36.316 "923cf4b7-ab8c-4072-bd26-424f79e19087" 00:15:36.316 ], 00:15:36.316 "product_name": "Malloc disk", 00:15:36.316 "block_size": 512, 00:15:36.316 "num_blocks": 65536, 00:15:36.316 "uuid": "923cf4b7-ab8c-4072-bd26-424f79e19087", 00:15:36.316 "assigned_rate_limits": { 00:15:36.316 "rw_ios_per_sec": 0, 00:15:36.316 "rw_mbytes_per_sec": 0, 00:15:36.316 "r_mbytes_per_sec": 0, 00:15:36.316 "w_mbytes_per_sec": 0 00:15:36.316 }, 00:15:36.316 "claimed": true, 00:15:36.316 "claim_type": "exclusive_write", 00:15:36.316 "zoned": false, 00:15:36.316 "supported_io_types": { 00:15:36.316 "read": true, 00:15:36.316 "write": true, 00:15:36.316 "unmap": true, 00:15:36.316 "flush": true, 00:15:36.316 "reset": true, 00:15:36.316 "nvme_admin": false, 00:15:36.316 "nvme_io": false, 00:15:36.316 "nvme_io_md": false, 00:15:36.316 "write_zeroes": true, 00:15:36.316 "zcopy": true, 00:15:36.316 "get_zone_info": false, 00:15:36.316 "zone_management": false, 00:15:36.316 "zone_append": false, 00:15:36.316 "compare": false, 00:15:36.316 "compare_and_write": false, 00:15:36.316 "abort": true, 00:15:36.316 "seek_hole": false, 00:15:36.316 "seek_data": false, 00:15:36.316 "copy": true, 00:15:36.316 "nvme_iov_md": false 00:15:36.316 }, 00:15:36.316 "memory_domains": [ 00:15:36.316 { 00:15:36.316 "dma_device_id": "system", 00:15:36.316 "dma_device_type": 1 00:15:36.316 }, 00:15:36.316 { 00:15:36.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.316 "dma_device_type": 2 00:15:36.316 } 00:15:36.316 ], 00:15:36.316 "driver_specific": {} 00:15:36.316 }' 00:15:36.316 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:36.316 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:36.316 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:36.316 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:36.316 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:36.576 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:36.576 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:36.576 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:36.576 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:36.576 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:36.576 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:36.576 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:36.576 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:36.576 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:36.576 22:22:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:36.835 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:36.835 "name": "BaseBdev2", 
00:15:36.835 "aliases": [ 00:15:36.835 "0ec6c975-8ee6-4694-b8db-c9e69880e93e" 00:15:36.835 ], 00:15:36.835 "product_name": "Malloc disk", 00:15:36.835 "block_size": 512, 00:15:36.835 "num_blocks": 65536, 00:15:36.835 "uuid": "0ec6c975-8ee6-4694-b8db-c9e69880e93e", 00:15:36.835 "assigned_rate_limits": { 00:15:36.835 "rw_ios_per_sec": 0, 00:15:36.835 "rw_mbytes_per_sec": 0, 00:15:36.835 "r_mbytes_per_sec": 0, 00:15:36.835 "w_mbytes_per_sec": 0 00:15:36.835 }, 00:15:36.835 "claimed": true, 00:15:36.835 "claim_type": "exclusive_write", 00:15:36.835 "zoned": false, 00:15:36.835 "supported_io_types": { 00:15:36.835 "read": true, 00:15:36.835 "write": true, 00:15:36.835 "unmap": true, 00:15:36.835 "flush": true, 00:15:36.835 "reset": true, 00:15:36.835 "nvme_admin": false, 00:15:36.835 "nvme_io": false, 00:15:36.835 "nvme_io_md": false, 00:15:36.835 "write_zeroes": true, 00:15:36.835 "zcopy": true, 00:15:36.835 "get_zone_info": false, 00:15:36.835 "zone_management": false, 00:15:36.835 "zone_append": false, 00:15:36.835 "compare": false, 00:15:36.835 "compare_and_write": false, 00:15:36.835 "abort": true, 00:15:36.835 "seek_hole": false, 00:15:36.835 "seek_data": false, 00:15:36.835 "copy": true, 00:15:36.835 "nvme_iov_md": false 00:15:36.835 }, 00:15:36.835 "memory_domains": [ 00:15:36.835 { 00:15:36.835 "dma_device_id": "system", 00:15:36.835 "dma_device_type": 1 00:15:36.835 }, 00:15:36.835 { 00:15:36.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.835 "dma_device_type": 2 00:15:36.836 } 00:15:36.836 ], 00:15:36.836 "driver_specific": {} 00:15:36.836 }' 00:15:36.836 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:36.836 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.095 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:37.095 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.095 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.095 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:37.095 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.095 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.095 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:37.095 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:37.095 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:37.355 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:37.355 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:37.355 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:37.355 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:37.614 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:37.614 "name": "BaseBdev3", 00:15:37.614 "aliases": [ 00:15:37.614 "00cfb204-62ac-4b95-84e1-bd7759820aeb" 00:15:37.614 ], 00:15:37.614 "product_name": "Malloc disk", 00:15:37.614 "block_size": 512, 
00:15:37.614 "num_blocks": 65536, 00:15:37.614 "uuid": "00cfb204-62ac-4b95-84e1-bd7759820aeb", 00:15:37.614 "assigned_rate_limits": { 00:15:37.614 "rw_ios_per_sec": 0, 00:15:37.614 "rw_mbytes_per_sec": 0, 00:15:37.614 "r_mbytes_per_sec": 0, 00:15:37.614 "w_mbytes_per_sec": 0 00:15:37.614 }, 00:15:37.614 "claimed": true, 00:15:37.614 "claim_type": "exclusive_write", 00:15:37.614 "zoned": false, 00:15:37.614 "supported_io_types": { 00:15:37.614 "read": true, 00:15:37.614 "write": true, 00:15:37.614 "unmap": true, 00:15:37.614 "flush": true, 00:15:37.614 "reset": true, 00:15:37.614 "nvme_admin": false, 00:15:37.614 "nvme_io": false, 00:15:37.614 "nvme_io_md": false, 00:15:37.614 "write_zeroes": true, 00:15:37.614 "zcopy": true, 00:15:37.614 "get_zone_info": false, 00:15:37.614 "zone_management": false, 00:15:37.614 "zone_append": false, 00:15:37.614 "compare": false, 00:15:37.614 "compare_and_write": false, 00:15:37.614 "abort": true, 00:15:37.614 "seek_hole": false, 00:15:37.614 "seek_data": false, 00:15:37.614 "copy": true, 00:15:37.614 "nvme_iov_md": false 00:15:37.614 }, 00:15:37.614 "memory_domains": [ 00:15:37.614 { 00:15:37.614 "dma_device_id": "system", 00:15:37.614 "dma_device_type": 1 00:15:37.614 }, 00:15:37.614 { 00:15:37.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.614 "dma_device_type": 2 00:15:37.614 } 00:15:37.614 ], 00:15:37.614 "driver_specific": {} 00:15:37.614 }' 00:15:37.614 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.614 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.614 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:37.614 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.614 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.614 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:37.614 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.614 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.874 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:37.874 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:37.874 22:22:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:37.874 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:37.874 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:38.134 [2024-07-12 22:22:48.267476] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:38.134 [2024-07-12 22:22:48.267506] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:38.134 [2024-07-12 22:22:48.267548] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:38.134 22:22:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.134 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:38.393 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.393 "name": "Existed_Raid", 00:15:38.393 "uuid": "d45aeaa9-54dd-4fda-9922-dc8a1166694b", 00:15:38.393 "strip_size_kb": 64, 00:15:38.393 "state": "offline", 00:15:38.393 "raid_level": "concat", 00:15:38.393 "superblock": false, 00:15:38.393 "num_base_bdevs": 3, 00:15:38.393 "num_base_bdevs_discovered": 2, 00:15:38.393 "num_base_bdevs_operational": 2, 00:15:38.393 "base_bdevs_list": [ 00:15:38.393 { 00:15:38.393 "name": null, 00:15:38.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:38.393 "is_configured": false, 00:15:38.393 "data_offset": 0, 00:15:38.393 "data_size": 65536 00:15:38.393 }, 00:15:38.393 { 00:15:38.393 "name": "BaseBdev2", 00:15:38.393 "uuid": "0ec6c975-8ee6-4694-b8db-c9e69880e93e", 00:15:38.393 "is_configured": true, 00:15:38.393 "data_offset": 0, 00:15:38.393 "data_size": 65536 00:15:38.393 }, 00:15:38.393 { 00:15:38.393 "name": "BaseBdev3", 00:15:38.393 "uuid": "00cfb204-62ac-4b95-84e1-bd7759820aeb", 00:15:38.393 "is_configured": true, 00:15:38.393 "data_offset": 0, 00:15:38.393 "data_size": 65536 00:15:38.393 } 00:15:38.393 ] 00:15:38.393 }' 00:15:38.393 22:22:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.393 22:22:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.962 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:38.962 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:38.962 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:38.962 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.221 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:39.221 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:39.221 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:39.221 [2024-07-12 22:22:49.519860] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:39.480 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:39.480 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:39.480 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.480 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:39.480 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:39.480 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:39.481 22:22:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:39.740 [2024-07-12 22:22:50.015750] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:39.740 [2024-07-12 22:22:50.015802] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb2d400 name Existed_Raid, state offline 00:15:39.740 22:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:39.740 22:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:39.740 22:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.740 22:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:40.000 22:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:40.000 22:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:40.000 22:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:40.000 22:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:40.000 22:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:40.000 22:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:40.260 BaseBdev2 00:15:40.260 22:22:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:40.260 22:22:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:40.260 22:22:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:40.260 22:22:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:40.260 
22:22:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:40.260 22:22:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:40.260 22:22:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:40.519 22:22:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:40.778 [ 00:15:40.778 { 00:15:40.778 "name": "BaseBdev2", 00:15:40.778 "aliases": [ 00:15:40.778 "5a449f96-a890-4ee9-b255-579c12499022" 00:15:40.778 ], 00:15:40.778 "product_name": "Malloc disk", 00:15:40.778 "block_size": 512, 00:15:40.778 "num_blocks": 65536, 00:15:40.778 "uuid": "5a449f96-a890-4ee9-b255-579c12499022", 00:15:40.778 "assigned_rate_limits": { 00:15:40.778 "rw_ios_per_sec": 0, 00:15:40.778 "rw_mbytes_per_sec": 0, 00:15:40.778 "r_mbytes_per_sec": 0, 00:15:40.778 "w_mbytes_per_sec": 0 00:15:40.778 }, 00:15:40.778 "claimed": false, 00:15:40.778 "zoned": false, 00:15:40.778 "supported_io_types": { 00:15:40.778 "read": true, 00:15:40.778 "write": true, 00:15:40.778 "unmap": true, 00:15:40.778 "flush": true, 00:15:40.778 "reset": true, 00:15:40.778 "nvme_admin": false, 00:15:40.778 "nvme_io": false, 00:15:40.778 "nvme_io_md": false, 00:15:40.778 "write_zeroes": true, 00:15:40.778 "zcopy": true, 00:15:40.778 "get_zone_info": false, 00:15:40.778 "zone_management": false, 00:15:40.778 "zone_append": false, 00:15:40.778 "compare": false, 00:15:40.778 "compare_and_write": false, 00:15:40.778 "abort": true, 00:15:40.778 "seek_hole": false, 00:15:40.778 "seek_data": false, 00:15:40.778 "copy": true, 00:15:40.778 "nvme_iov_md": false 00:15:40.778 }, 00:15:40.778 "memory_domains": [ 00:15:40.778 { 00:15:40.778 "dma_device_id": "system", 00:15:40.778 "dma_device_type": 1 00:15:40.778 }, 00:15:40.778 { 00:15:40.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.778 "dma_device_type": 2 00:15:40.778 } 00:15:40.778 ], 00:15:40.778 "driver_specific": {} 00:15:40.778 } 00:15:40.778 ] 00:15:40.778 22:22:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:40.778 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:40.778 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:40.778 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:41.038 BaseBdev3 00:15:41.038 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:41.038 22:22:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:41.038 22:22:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:41.038 22:22:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:41.038 22:22:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:41.038 22:22:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:41.038 22:22:51 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:41.297 22:22:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:41.557 [ 00:15:41.557 { 00:15:41.557 "name": "BaseBdev3", 00:15:41.557 "aliases": [ 00:15:41.557 "13f74e9e-4228-42b5-8b49-5032f9157bd0" 00:15:41.557 ], 00:15:41.557 "product_name": "Malloc disk", 00:15:41.557 "block_size": 512, 00:15:41.557 "num_blocks": 65536, 00:15:41.557 "uuid": "13f74e9e-4228-42b5-8b49-5032f9157bd0", 00:15:41.557 "assigned_rate_limits": { 00:15:41.557 "rw_ios_per_sec": 0, 00:15:41.557 "rw_mbytes_per_sec": 0, 00:15:41.557 "r_mbytes_per_sec": 0, 00:15:41.557 "w_mbytes_per_sec": 0 00:15:41.557 }, 00:15:41.557 "claimed": false, 00:15:41.557 "zoned": false, 00:15:41.557 "supported_io_types": { 00:15:41.557 "read": true, 00:15:41.557 "write": true, 00:15:41.557 "unmap": true, 00:15:41.557 "flush": true, 00:15:41.557 "reset": true, 00:15:41.557 "nvme_admin": false, 00:15:41.557 "nvme_io": false, 00:15:41.557 "nvme_io_md": false, 00:15:41.557 "write_zeroes": true, 00:15:41.557 "zcopy": true, 00:15:41.557 "get_zone_info": false, 00:15:41.557 "zone_management": false, 00:15:41.557 "zone_append": false, 00:15:41.557 "compare": false, 00:15:41.557 "compare_and_write": false, 00:15:41.557 "abort": true, 00:15:41.557 "seek_hole": false, 00:15:41.557 "seek_data": false, 00:15:41.557 "copy": true, 00:15:41.557 "nvme_iov_md": false 00:15:41.557 }, 00:15:41.557 "memory_domains": [ 00:15:41.557 { 00:15:41.557 "dma_device_id": "system", 00:15:41.557 "dma_device_type": 1 00:15:41.557 }, 00:15:41.557 { 00:15:41.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.557 "dma_device_type": 2 00:15:41.557 } 00:15:41.557 ], 00:15:41.557 "driver_specific": {} 00:15:41.557 } 00:15:41.557 ] 00:15:41.557 22:22:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:41.557 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:41.557 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:41.557 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:41.817 [2024-07-12 22:22:51.970131] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:41.817 [2024-07-12 22:22:51.970178] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:41.817 [2024-07-12 22:22:51.970198] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:41.817 [2024-07-12 22:22:51.971578] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:41.817 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:41.817 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:41.817 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:41.817 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:15:41.817 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:41.817 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:41.817 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:41.817 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:41.817 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:41.817 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:41.817 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.817 22:22:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:42.076 22:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:42.076 "name": "Existed_Raid", 00:15:42.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:42.076 "strip_size_kb": 64, 00:15:42.076 "state": "configuring", 00:15:42.076 "raid_level": "concat", 00:15:42.076 "superblock": false, 00:15:42.076 "num_base_bdevs": 3, 00:15:42.076 "num_base_bdevs_discovered": 2, 00:15:42.076 "num_base_bdevs_operational": 3, 00:15:42.076 "base_bdevs_list": [ 00:15:42.076 { 00:15:42.076 "name": "BaseBdev1", 00:15:42.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:42.076 "is_configured": false, 00:15:42.076 "data_offset": 0, 00:15:42.076 "data_size": 0 00:15:42.076 }, 00:15:42.076 { 00:15:42.076 "name": "BaseBdev2", 00:15:42.076 "uuid": "5a449f96-a890-4ee9-b255-579c12499022", 00:15:42.076 "is_configured": true, 00:15:42.076 "data_offset": 0, 00:15:42.076 "data_size": 65536 00:15:42.076 }, 00:15:42.076 { 00:15:42.076 "name": "BaseBdev3", 00:15:42.076 "uuid": "13f74e9e-4228-42b5-8b49-5032f9157bd0", 00:15:42.076 "is_configured": true, 00:15:42.076 "data_offset": 0, 00:15:42.076 "data_size": 65536 00:15:42.076 } 00:15:42.076 ] 00:15:42.076 }' 00:15:42.077 22:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:42.077 22:22:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.647 22:22:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:42.913 [2024-07-12 22:22:52.992834] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:42.913 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:42.913 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:42.913 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:42.913 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:42.913 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:42.913 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:42.913 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:15:42.913 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:42.913 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:42.913 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:42.913 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.913 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:43.171 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:43.171 "name": "Existed_Raid", 00:15:43.171 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:43.171 "strip_size_kb": 64, 00:15:43.171 "state": "configuring", 00:15:43.171 "raid_level": "concat", 00:15:43.171 "superblock": false, 00:15:43.171 "num_base_bdevs": 3, 00:15:43.171 "num_base_bdevs_discovered": 1, 00:15:43.171 "num_base_bdevs_operational": 3, 00:15:43.171 "base_bdevs_list": [ 00:15:43.171 { 00:15:43.171 "name": "BaseBdev1", 00:15:43.171 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:43.171 "is_configured": false, 00:15:43.171 "data_offset": 0, 00:15:43.171 "data_size": 0 00:15:43.171 }, 00:15:43.171 { 00:15:43.171 "name": null, 00:15:43.171 "uuid": "5a449f96-a890-4ee9-b255-579c12499022", 00:15:43.171 "is_configured": false, 00:15:43.171 "data_offset": 0, 00:15:43.171 "data_size": 65536 00:15:43.171 }, 00:15:43.171 { 00:15:43.171 "name": "BaseBdev3", 00:15:43.171 "uuid": "13f74e9e-4228-42b5-8b49-5032f9157bd0", 00:15:43.171 "is_configured": true, 00:15:43.171 "data_offset": 0, 00:15:43.171 "data_size": 65536 00:15:43.171 } 00:15:43.171 ] 00:15:43.171 }' 00:15:43.171 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:43.171 22:22:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.740 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:43.740 22:22:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:43.999 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:43.999 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:44.281 [2024-07-12 22:22:54.351797] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:44.281 BaseBdev1 00:15:44.281 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:44.281 22:22:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:44.281 22:22:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:44.281 22:22:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:44.281 22:22:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:44.281 22:22:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:15:44.281 22:22:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:44.281 22:22:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:44.541 [ 00:15:44.541 { 00:15:44.541 "name": "BaseBdev1", 00:15:44.541 "aliases": [ 00:15:44.541 "9680b39e-9daa-45f0-aec9-332f0636b734" 00:15:44.541 ], 00:15:44.541 "product_name": "Malloc disk", 00:15:44.541 "block_size": 512, 00:15:44.541 "num_blocks": 65536, 00:15:44.541 "uuid": "9680b39e-9daa-45f0-aec9-332f0636b734", 00:15:44.541 "assigned_rate_limits": { 00:15:44.541 "rw_ios_per_sec": 0, 00:15:44.541 "rw_mbytes_per_sec": 0, 00:15:44.541 "r_mbytes_per_sec": 0, 00:15:44.541 "w_mbytes_per_sec": 0 00:15:44.541 }, 00:15:44.541 "claimed": true, 00:15:44.541 "claim_type": "exclusive_write", 00:15:44.541 "zoned": false, 00:15:44.541 "supported_io_types": { 00:15:44.541 "read": true, 00:15:44.541 "write": true, 00:15:44.541 "unmap": true, 00:15:44.541 "flush": true, 00:15:44.541 "reset": true, 00:15:44.541 "nvme_admin": false, 00:15:44.541 "nvme_io": false, 00:15:44.541 "nvme_io_md": false, 00:15:44.541 "write_zeroes": true, 00:15:44.541 "zcopy": true, 00:15:44.541 "get_zone_info": false, 00:15:44.541 "zone_management": false, 00:15:44.541 "zone_append": false, 00:15:44.541 "compare": false, 00:15:44.541 "compare_and_write": false, 00:15:44.541 "abort": true, 00:15:44.541 "seek_hole": false, 00:15:44.541 "seek_data": false, 00:15:44.541 "copy": true, 00:15:44.541 "nvme_iov_md": false 00:15:44.541 }, 00:15:44.541 "memory_domains": [ 00:15:44.541 { 00:15:44.541 "dma_device_id": "system", 00:15:44.541 "dma_device_type": 1 00:15:44.541 }, 00:15:44.541 { 00:15:44.541 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.541 "dma_device_type": 2 00:15:44.541 } 00:15:44.541 ], 00:15:44.541 "driver_specific": {} 00:15:44.541 } 00:15:44.541 ] 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:15:44.541 22:22:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:44.800 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.800 "name": "Existed_Raid", 00:15:44.800 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.800 "strip_size_kb": 64, 00:15:44.800 "state": "configuring", 00:15:44.800 "raid_level": "concat", 00:15:44.800 "superblock": false, 00:15:44.800 "num_base_bdevs": 3, 00:15:44.800 "num_base_bdevs_discovered": 2, 00:15:44.800 "num_base_bdevs_operational": 3, 00:15:44.800 "base_bdevs_list": [ 00:15:44.800 { 00:15:44.800 "name": "BaseBdev1", 00:15:44.800 "uuid": "9680b39e-9daa-45f0-aec9-332f0636b734", 00:15:44.800 "is_configured": true, 00:15:44.800 "data_offset": 0, 00:15:44.800 "data_size": 65536 00:15:44.800 }, 00:15:44.800 { 00:15:44.800 "name": null, 00:15:44.800 "uuid": "5a449f96-a890-4ee9-b255-579c12499022", 00:15:44.800 "is_configured": false, 00:15:44.800 "data_offset": 0, 00:15:44.800 "data_size": 65536 00:15:44.800 }, 00:15:44.800 { 00:15:44.800 "name": "BaseBdev3", 00:15:44.800 "uuid": "13f74e9e-4228-42b5-8b49-5032f9157bd0", 00:15:44.800 "is_configured": true, 00:15:44.800 "data_offset": 0, 00:15:44.800 "data_size": 65536 00:15:44.800 } 00:15:44.800 ] 00:15:44.800 }' 00:15:44.800 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.800 22:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.370 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.370 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:45.630 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:45.630 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:45.889 [2024-07-12 22:22:55.972121] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:45.889 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:45.889 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:45.889 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:45.889 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:45.889 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:45.889 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:45.889 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.890 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.890 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.890 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.890 22:22:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.890 22:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:46.149 22:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:46.149 "name": "Existed_Raid", 00:15:46.149 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.149 "strip_size_kb": 64, 00:15:46.149 "state": "configuring", 00:15:46.149 "raid_level": "concat", 00:15:46.149 "superblock": false, 00:15:46.149 "num_base_bdevs": 3, 00:15:46.149 "num_base_bdevs_discovered": 1, 00:15:46.149 "num_base_bdevs_operational": 3, 00:15:46.149 "base_bdevs_list": [ 00:15:46.149 { 00:15:46.149 "name": "BaseBdev1", 00:15:46.149 "uuid": "9680b39e-9daa-45f0-aec9-332f0636b734", 00:15:46.149 "is_configured": true, 00:15:46.149 "data_offset": 0, 00:15:46.149 "data_size": 65536 00:15:46.149 }, 00:15:46.149 { 00:15:46.149 "name": null, 00:15:46.149 "uuid": "5a449f96-a890-4ee9-b255-579c12499022", 00:15:46.149 "is_configured": false, 00:15:46.149 "data_offset": 0, 00:15:46.150 "data_size": 65536 00:15:46.150 }, 00:15:46.150 { 00:15:46.150 "name": null, 00:15:46.150 "uuid": "13f74e9e-4228-42b5-8b49-5032f9157bd0", 00:15:46.150 "is_configured": false, 00:15:46.150 "data_offset": 0, 00:15:46.150 "data_size": 65536 00:15:46.150 } 00:15:46.150 ] 00:15:46.150 }' 00:15:46.150 22:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:46.150 22:22:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:46.718 22:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.718 22:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:46.977 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:46.977 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:46.977 [2024-07-12 22:22:57.295650] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:47.237 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:47.238 22:22:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:47.238 "name": "Existed_Raid", 00:15:47.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.238 "strip_size_kb": 64, 00:15:47.238 "state": "configuring", 00:15:47.238 "raid_level": "concat", 00:15:47.238 "superblock": false, 00:15:47.238 "num_base_bdevs": 3, 00:15:47.238 "num_base_bdevs_discovered": 2, 00:15:47.238 "num_base_bdevs_operational": 3, 00:15:47.238 "base_bdevs_list": [ 00:15:47.238 { 00:15:47.238 "name": "BaseBdev1", 00:15:47.238 "uuid": "9680b39e-9daa-45f0-aec9-332f0636b734", 00:15:47.238 "is_configured": true, 00:15:47.238 "data_offset": 0, 00:15:47.238 "data_size": 65536 00:15:47.238 }, 00:15:47.238 { 00:15:47.238 "name": null, 00:15:47.238 "uuid": "5a449f96-a890-4ee9-b255-579c12499022", 00:15:47.238 "is_configured": false, 00:15:47.238 "data_offset": 0, 00:15:47.238 "data_size": 65536 00:15:47.238 }, 00:15:47.238 { 00:15:47.238 "name": "BaseBdev3", 00:15:47.238 "uuid": "13f74e9e-4228-42b5-8b49-5032f9157bd0", 00:15:47.238 "is_configured": true, 00:15:47.238 "data_offset": 0, 00:15:47.238 "data_size": 65536 00:15:47.238 } 00:15:47.238 ] 00:15:47.238 }' 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:47.238 22:22:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:47.805 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.805 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:48.152 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:48.152 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:48.412 [2024-07-12 22:22:58.494850] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:48.412 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:48.412 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.412 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.412 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.412 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.412 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.412 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.412 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.412 22:22:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.412 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.412 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.412 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.671 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.671 "name": "Existed_Raid", 00:15:48.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.671 "strip_size_kb": 64, 00:15:48.671 "state": "configuring", 00:15:48.672 "raid_level": "concat", 00:15:48.672 "superblock": false, 00:15:48.672 "num_base_bdevs": 3, 00:15:48.672 "num_base_bdevs_discovered": 1, 00:15:48.672 "num_base_bdevs_operational": 3, 00:15:48.672 "base_bdevs_list": [ 00:15:48.672 { 00:15:48.672 "name": null, 00:15:48.672 "uuid": "9680b39e-9daa-45f0-aec9-332f0636b734", 00:15:48.672 "is_configured": false, 00:15:48.672 "data_offset": 0, 00:15:48.672 "data_size": 65536 00:15:48.672 }, 00:15:48.672 { 00:15:48.672 "name": null, 00:15:48.672 "uuid": "5a449f96-a890-4ee9-b255-579c12499022", 00:15:48.672 "is_configured": false, 00:15:48.672 "data_offset": 0, 00:15:48.672 "data_size": 65536 00:15:48.672 }, 00:15:48.672 { 00:15:48.672 "name": "BaseBdev3", 00:15:48.672 "uuid": "13f74e9e-4228-42b5-8b49-5032f9157bd0", 00:15:48.672 "is_configured": true, 00:15:48.672 "data_offset": 0, 00:15:48.672 "data_size": 65536 00:15:48.672 } 00:15:48.672 ] 00:15:48.672 }' 00:15:48.672 22:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.672 22:22:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.610 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.610 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:49.610 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:49.610 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:49.869 [2024-07-12 22:22:59.975150] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:49.869 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:49.869 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:49.869 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:49.869 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:49.869 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:49.869 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:49.869 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:15:49.869 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:49.869 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:49.869 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:49.869 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.869 22:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:50.437 22:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.437 "name": "Existed_Raid", 00:15:50.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:50.437 "strip_size_kb": 64, 00:15:50.437 "state": "configuring", 00:15:50.437 "raid_level": "concat", 00:15:50.437 "superblock": false, 00:15:50.437 "num_base_bdevs": 3, 00:15:50.437 "num_base_bdevs_discovered": 2, 00:15:50.437 "num_base_bdevs_operational": 3, 00:15:50.437 "base_bdevs_list": [ 00:15:50.437 { 00:15:50.437 "name": null, 00:15:50.437 "uuid": "9680b39e-9daa-45f0-aec9-332f0636b734", 00:15:50.437 "is_configured": false, 00:15:50.437 "data_offset": 0, 00:15:50.437 "data_size": 65536 00:15:50.437 }, 00:15:50.437 { 00:15:50.437 "name": "BaseBdev2", 00:15:50.437 "uuid": "5a449f96-a890-4ee9-b255-579c12499022", 00:15:50.437 "is_configured": true, 00:15:50.437 "data_offset": 0, 00:15:50.437 "data_size": 65536 00:15:50.437 }, 00:15:50.437 { 00:15:50.437 "name": "BaseBdev3", 00:15:50.437 "uuid": "13f74e9e-4228-42b5-8b49-5032f9157bd0", 00:15:50.437 "is_configured": true, 00:15:50.437 "data_offset": 0, 00:15:50.437 "data_size": 65536 00:15:50.437 } 00:15:50.437 ] 00:15:50.437 }' 00:15:50.437 22:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.437 22:23:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.006 22:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.006 22:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:51.006 22:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:51.006 22:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.006 22:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:51.264 22:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 9680b39e-9daa-45f0-aec9-332f0636b734 00:15:51.523 [2024-07-12 22:23:01.756482] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:51.523 [2024-07-12 22:23:01.756526] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb2b450 00:15:51.523 [2024-07-12 22:23:01.756535] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:51.523 [2024-07-12 22:23:01.756735] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb2ced0 00:15:51.523 [2024-07-12 22:23:01.756851] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb2b450 00:15:51.523 [2024-07-12 22:23:01.756861] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb2b450 00:15:51.523 [2024-07-12 22:23:01.757042] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:51.523 NewBaseBdev 00:15:51.523 22:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:51.523 22:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:51.523 22:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:51.523 22:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:51.523 22:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:51.523 22:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:51.523 22:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:51.782 22:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:52.040 [ 00:15:52.040 { 00:15:52.040 "name": "NewBaseBdev", 00:15:52.040 "aliases": [ 00:15:52.040 "9680b39e-9daa-45f0-aec9-332f0636b734" 00:15:52.040 ], 00:15:52.040 "product_name": "Malloc disk", 00:15:52.040 "block_size": 512, 00:15:52.040 "num_blocks": 65536, 00:15:52.040 "uuid": "9680b39e-9daa-45f0-aec9-332f0636b734", 00:15:52.040 "assigned_rate_limits": { 00:15:52.040 "rw_ios_per_sec": 0, 00:15:52.040 "rw_mbytes_per_sec": 0, 00:15:52.040 "r_mbytes_per_sec": 0, 00:15:52.040 "w_mbytes_per_sec": 0 00:15:52.040 }, 00:15:52.040 "claimed": true, 00:15:52.040 "claim_type": "exclusive_write", 00:15:52.040 "zoned": false, 00:15:52.040 "supported_io_types": { 00:15:52.040 "read": true, 00:15:52.040 "write": true, 00:15:52.040 "unmap": true, 00:15:52.040 "flush": true, 00:15:52.040 "reset": true, 00:15:52.040 "nvme_admin": false, 00:15:52.040 "nvme_io": false, 00:15:52.040 "nvme_io_md": false, 00:15:52.040 "write_zeroes": true, 00:15:52.040 "zcopy": true, 00:15:52.040 "get_zone_info": false, 00:15:52.040 "zone_management": false, 00:15:52.040 "zone_append": false, 00:15:52.040 "compare": false, 00:15:52.040 "compare_and_write": false, 00:15:52.040 "abort": true, 00:15:52.040 "seek_hole": false, 00:15:52.040 "seek_data": false, 00:15:52.040 "copy": true, 00:15:52.040 "nvme_iov_md": false 00:15:52.040 }, 00:15:52.040 "memory_domains": [ 00:15:52.040 { 00:15:52.040 "dma_device_id": "system", 00:15:52.040 "dma_device_type": 1 00:15:52.040 }, 00:15:52.040 { 00:15:52.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.040 "dma_device_type": 2 00:15:52.040 } 00:15:52.040 ], 00:15:52.040 "driver_specific": {} 00:15:52.040 } 00:15:52.040 ] 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.040 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:52.299 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.299 "name": "Existed_Raid", 00:15:52.299 "uuid": "0b540d83-1706-47af-b92a-2f8a520d92b7", 00:15:52.299 "strip_size_kb": 64, 00:15:52.299 "state": "online", 00:15:52.299 "raid_level": "concat", 00:15:52.299 "superblock": false, 00:15:52.299 "num_base_bdevs": 3, 00:15:52.299 "num_base_bdevs_discovered": 3, 00:15:52.299 "num_base_bdevs_operational": 3, 00:15:52.299 "base_bdevs_list": [ 00:15:52.299 { 00:15:52.299 "name": "NewBaseBdev", 00:15:52.299 "uuid": "9680b39e-9daa-45f0-aec9-332f0636b734", 00:15:52.299 "is_configured": true, 00:15:52.299 "data_offset": 0, 00:15:52.299 "data_size": 65536 00:15:52.299 }, 00:15:52.299 { 00:15:52.299 "name": "BaseBdev2", 00:15:52.299 "uuid": "5a449f96-a890-4ee9-b255-579c12499022", 00:15:52.299 "is_configured": true, 00:15:52.299 "data_offset": 0, 00:15:52.299 "data_size": 65536 00:15:52.299 }, 00:15:52.299 { 00:15:52.299 "name": "BaseBdev3", 00:15:52.299 "uuid": "13f74e9e-4228-42b5-8b49-5032f9157bd0", 00:15:52.299 "is_configured": true, 00:15:52.299 "data_offset": 0, 00:15:52.299 "data_size": 65536 00:15:52.299 } 00:15:52.299 ] 00:15:52.299 }' 00:15:52.299 22:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.299 22:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:53.235 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:53.235 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:53.235 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:53.235 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:53.235 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:53.235 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:53.235 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 
-b Existed_Raid 00:15:53.235 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:53.494 [2024-07-12 22:23:03.569589] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:53.494 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:53.494 "name": "Existed_Raid", 00:15:53.494 "aliases": [ 00:15:53.494 "0b540d83-1706-47af-b92a-2f8a520d92b7" 00:15:53.494 ], 00:15:53.494 "product_name": "Raid Volume", 00:15:53.494 "block_size": 512, 00:15:53.494 "num_blocks": 196608, 00:15:53.494 "uuid": "0b540d83-1706-47af-b92a-2f8a520d92b7", 00:15:53.494 "assigned_rate_limits": { 00:15:53.494 "rw_ios_per_sec": 0, 00:15:53.494 "rw_mbytes_per_sec": 0, 00:15:53.494 "r_mbytes_per_sec": 0, 00:15:53.494 "w_mbytes_per_sec": 0 00:15:53.494 }, 00:15:53.494 "claimed": false, 00:15:53.494 "zoned": false, 00:15:53.494 "supported_io_types": { 00:15:53.494 "read": true, 00:15:53.494 "write": true, 00:15:53.494 "unmap": true, 00:15:53.494 "flush": true, 00:15:53.494 "reset": true, 00:15:53.494 "nvme_admin": false, 00:15:53.494 "nvme_io": false, 00:15:53.494 "nvme_io_md": false, 00:15:53.494 "write_zeroes": true, 00:15:53.494 "zcopy": false, 00:15:53.494 "get_zone_info": false, 00:15:53.494 "zone_management": false, 00:15:53.494 "zone_append": false, 00:15:53.494 "compare": false, 00:15:53.494 "compare_and_write": false, 00:15:53.494 "abort": false, 00:15:53.494 "seek_hole": false, 00:15:53.494 "seek_data": false, 00:15:53.494 "copy": false, 00:15:53.494 "nvme_iov_md": false 00:15:53.494 }, 00:15:53.494 "memory_domains": [ 00:15:53.494 { 00:15:53.494 "dma_device_id": "system", 00:15:53.494 "dma_device_type": 1 00:15:53.494 }, 00:15:53.494 { 00:15:53.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.494 "dma_device_type": 2 00:15:53.494 }, 00:15:53.494 { 00:15:53.494 "dma_device_id": "system", 00:15:53.494 "dma_device_type": 1 00:15:53.494 }, 00:15:53.494 { 00:15:53.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.495 "dma_device_type": 2 00:15:53.495 }, 00:15:53.495 { 00:15:53.495 "dma_device_id": "system", 00:15:53.495 "dma_device_type": 1 00:15:53.495 }, 00:15:53.495 { 00:15:53.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.495 "dma_device_type": 2 00:15:53.495 } 00:15:53.495 ], 00:15:53.495 "driver_specific": { 00:15:53.495 "raid": { 00:15:53.495 "uuid": "0b540d83-1706-47af-b92a-2f8a520d92b7", 00:15:53.495 "strip_size_kb": 64, 00:15:53.495 "state": "online", 00:15:53.495 "raid_level": "concat", 00:15:53.495 "superblock": false, 00:15:53.495 "num_base_bdevs": 3, 00:15:53.495 "num_base_bdevs_discovered": 3, 00:15:53.495 "num_base_bdevs_operational": 3, 00:15:53.495 "base_bdevs_list": [ 00:15:53.495 { 00:15:53.495 "name": "NewBaseBdev", 00:15:53.495 "uuid": "9680b39e-9daa-45f0-aec9-332f0636b734", 00:15:53.495 "is_configured": true, 00:15:53.495 "data_offset": 0, 00:15:53.495 "data_size": 65536 00:15:53.495 }, 00:15:53.495 { 00:15:53.495 "name": "BaseBdev2", 00:15:53.495 "uuid": "5a449f96-a890-4ee9-b255-579c12499022", 00:15:53.495 "is_configured": true, 00:15:53.495 "data_offset": 0, 00:15:53.495 "data_size": 65536 00:15:53.495 }, 00:15:53.495 { 00:15:53.495 "name": "BaseBdev3", 00:15:53.495 "uuid": "13f74e9e-4228-42b5-8b49-5032f9157bd0", 00:15:53.495 "is_configured": true, 00:15:53.495 "data_offset": 0, 00:15:53.495 "data_size": 65536 00:15:53.495 } 00:15:53.495 ] 00:15:53.495 } 00:15:53.495 } 00:15:53.495 }' 00:15:53.495 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:53.495 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:53.495 BaseBdev2 00:15:53.495 BaseBdev3' 00:15:53.495 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:53.495 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:53.495 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:53.754 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:53.754 "name": "NewBaseBdev", 00:15:53.754 "aliases": [ 00:15:53.754 "9680b39e-9daa-45f0-aec9-332f0636b734" 00:15:53.754 ], 00:15:53.754 "product_name": "Malloc disk", 00:15:53.754 "block_size": 512, 00:15:53.754 "num_blocks": 65536, 00:15:53.754 "uuid": "9680b39e-9daa-45f0-aec9-332f0636b734", 00:15:53.754 "assigned_rate_limits": { 00:15:53.754 "rw_ios_per_sec": 0, 00:15:53.754 "rw_mbytes_per_sec": 0, 00:15:53.754 "r_mbytes_per_sec": 0, 00:15:53.754 "w_mbytes_per_sec": 0 00:15:53.754 }, 00:15:53.754 "claimed": true, 00:15:53.754 "claim_type": "exclusive_write", 00:15:53.754 "zoned": false, 00:15:53.754 "supported_io_types": { 00:15:53.754 "read": true, 00:15:53.754 "write": true, 00:15:53.754 "unmap": true, 00:15:53.754 "flush": true, 00:15:53.754 "reset": true, 00:15:53.754 "nvme_admin": false, 00:15:53.754 "nvme_io": false, 00:15:53.754 "nvme_io_md": false, 00:15:53.754 "write_zeroes": true, 00:15:53.754 "zcopy": true, 00:15:53.754 "get_zone_info": false, 00:15:53.754 "zone_management": false, 00:15:53.754 "zone_append": false, 00:15:53.754 "compare": false, 00:15:53.754 "compare_and_write": false, 00:15:53.755 "abort": true, 00:15:53.755 "seek_hole": false, 00:15:53.755 "seek_data": false, 00:15:53.755 "copy": true, 00:15:53.755 "nvme_iov_md": false 00:15:53.755 }, 00:15:53.755 "memory_domains": [ 00:15:53.755 { 00:15:53.755 "dma_device_id": "system", 00:15:53.755 "dma_device_type": 1 00:15:53.755 }, 00:15:53.755 { 00:15:53.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.755 "dma_device_type": 2 00:15:53.755 } 00:15:53.755 ], 00:15:53.755 "driver_specific": {} 00:15:53.755 }' 00:15:53.755 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.755 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.755 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:53.755 22:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.755 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.755 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:53.755 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.014 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.014 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:54.014 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.014 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.014 22:23:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:54.014 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:54.014 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:54.014 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:54.273 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:54.273 "name": "BaseBdev2", 00:15:54.273 "aliases": [ 00:15:54.273 "5a449f96-a890-4ee9-b255-579c12499022" 00:15:54.273 ], 00:15:54.273 "product_name": "Malloc disk", 00:15:54.273 "block_size": 512, 00:15:54.273 "num_blocks": 65536, 00:15:54.273 "uuid": "5a449f96-a890-4ee9-b255-579c12499022", 00:15:54.273 "assigned_rate_limits": { 00:15:54.273 "rw_ios_per_sec": 0, 00:15:54.273 "rw_mbytes_per_sec": 0, 00:15:54.273 "r_mbytes_per_sec": 0, 00:15:54.273 "w_mbytes_per_sec": 0 00:15:54.273 }, 00:15:54.273 "claimed": true, 00:15:54.273 "claim_type": "exclusive_write", 00:15:54.273 "zoned": false, 00:15:54.273 "supported_io_types": { 00:15:54.273 "read": true, 00:15:54.273 "write": true, 00:15:54.273 "unmap": true, 00:15:54.273 "flush": true, 00:15:54.273 "reset": true, 00:15:54.273 "nvme_admin": false, 00:15:54.273 "nvme_io": false, 00:15:54.273 "nvme_io_md": false, 00:15:54.273 "write_zeroes": true, 00:15:54.273 "zcopy": true, 00:15:54.273 "get_zone_info": false, 00:15:54.273 "zone_management": false, 00:15:54.273 "zone_append": false, 00:15:54.273 "compare": false, 00:15:54.273 "compare_and_write": false, 00:15:54.273 "abort": true, 00:15:54.273 "seek_hole": false, 00:15:54.273 "seek_data": false, 00:15:54.273 "copy": true, 00:15:54.273 "nvme_iov_md": false 00:15:54.273 }, 00:15:54.273 "memory_domains": [ 00:15:54.273 { 00:15:54.273 "dma_device_id": "system", 00:15:54.273 "dma_device_type": 1 00:15:54.273 }, 00:15:54.273 { 00:15:54.273 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.273 "dma_device_type": 2 00:15:54.273 } 00:15:54.273 ], 00:15:54.273 "driver_specific": {} 00:15:54.273 }' 00:15:54.273 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:54.531 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:54.531 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:54.531 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:54.531 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:54.531 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:54.531 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.790 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.790 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:54.790 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.790 22:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.790 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:54.790 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name 
in $base_bdev_names 00:15:54.790 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:54.790 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:55.358 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:55.358 "name": "BaseBdev3", 00:15:55.358 "aliases": [ 00:15:55.358 "13f74e9e-4228-42b5-8b49-5032f9157bd0" 00:15:55.358 ], 00:15:55.358 "product_name": "Malloc disk", 00:15:55.358 "block_size": 512, 00:15:55.358 "num_blocks": 65536, 00:15:55.358 "uuid": "13f74e9e-4228-42b5-8b49-5032f9157bd0", 00:15:55.358 "assigned_rate_limits": { 00:15:55.358 "rw_ios_per_sec": 0, 00:15:55.358 "rw_mbytes_per_sec": 0, 00:15:55.358 "r_mbytes_per_sec": 0, 00:15:55.358 "w_mbytes_per_sec": 0 00:15:55.358 }, 00:15:55.358 "claimed": true, 00:15:55.358 "claim_type": "exclusive_write", 00:15:55.358 "zoned": false, 00:15:55.358 "supported_io_types": { 00:15:55.358 "read": true, 00:15:55.358 "write": true, 00:15:55.358 "unmap": true, 00:15:55.358 "flush": true, 00:15:55.358 "reset": true, 00:15:55.358 "nvme_admin": false, 00:15:55.358 "nvme_io": false, 00:15:55.358 "nvme_io_md": false, 00:15:55.358 "write_zeroes": true, 00:15:55.358 "zcopy": true, 00:15:55.358 "get_zone_info": false, 00:15:55.358 "zone_management": false, 00:15:55.358 "zone_append": false, 00:15:55.358 "compare": false, 00:15:55.358 "compare_and_write": false, 00:15:55.358 "abort": true, 00:15:55.358 "seek_hole": false, 00:15:55.358 "seek_data": false, 00:15:55.358 "copy": true, 00:15:55.358 "nvme_iov_md": false 00:15:55.358 }, 00:15:55.358 "memory_domains": [ 00:15:55.358 { 00:15:55.358 "dma_device_id": "system", 00:15:55.358 "dma_device_type": 1 00:15:55.358 }, 00:15:55.358 { 00:15:55.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.358 "dma_device_type": 2 00:15:55.358 } 00:15:55.358 ], 00:15:55.358 "driver_specific": {} 00:15:55.358 }' 00:15:55.359 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:55.359 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:55.359 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:55.359 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:55.359 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:55.617 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:55.617 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:55.617 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:55.617 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:55.617 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:55.617 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:55.617 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:55.617 22:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:55.876 [2024-07-12 22:23:06.112078] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:55.876 [2024-07-12 22:23:06.112109] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:55.876 [2024-07-12 22:23:06.112167] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:55.876 [2024-07-12 22:23:06.112219] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:55.876 [2024-07-12 22:23:06.112231] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb2b450 name Existed_Raid, state offline 00:15:55.876 22:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3454568 00:15:55.876 22:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3454568 ']' 00:15:55.876 22:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3454568 00:15:55.876 22:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:55.876 22:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:55.876 22:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3454568 00:15:55.876 22:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:55.876 22:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:55.876 22:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3454568' 00:15:55.876 killing process with pid 3454568 00:15:55.876 22:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3454568 00:15:55.876 [2024-07-12 22:23:06.184093] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:55.876 22:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3454568 00:15:56.135 [2024-07-12 22:23:06.211727] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:56.135 22:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:56.135 00:15:56.135 real 0m28.602s 00:15:56.135 user 0m52.592s 00:15:56.135 sys 0m5.027s 00:15:56.135 22:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:56.135 22:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:56.135 ************************************ 00:15:56.135 END TEST raid_state_function_test 00:15:56.135 ************************************ 00:15:56.394 22:23:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:56.394 22:23:06 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:15:56.394 22:23:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:56.394 22:23:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:56.394 22:23:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:56.394 ************************************ 00:15:56.394 START TEST raid_state_function_test_sb 00:15:56.394 ************************************ 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # 
local raid_level=concat 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3458967 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3458967' 00:15:56.394 Process raid pid: 3458967 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3458967 /var/tmp/spdk-raid.sock 00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3458967 ']' 
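(Editorially condensed recap, not part of the test script: the raid_state_function_test_sb run that starts here drives the same bdev_raid RPC surface as the test above, this time with on-disk superblocks. The socket path, strip size, bdev names and sizes below are taken verbatim from the trace; the listing is an illustrative sketch of the sequence the following log exercises.)

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# start the bdev service that will host the raid bdev under test
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
# create the concat raid first (-s = with superblock); its base bdevs do not exist yet,
# so the raid stays in the "configuring" state until they are created and claimed
$rpc bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
# create a base bdev (32 MB malloc disk, 512-byte blocks); the raid claims it on creation
$rpc bdev_malloc_create 32 512 -b BaseBdev1
# query the raid state after each step, as the test does throughout the trace
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'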
00:15:56.394 22:23:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:56.395 22:23:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:56.395 22:23:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:56.395 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:56.395 22:23:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:56.395 22:23:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:56.395 [2024-07-12 22:23:06.598924] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:15:56.395 [2024-07-12 22:23:06.599010] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:56.653 [2024-07-12 22:23:06.731064] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:56.653 [2024-07-12 22:23:06.827791] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:56.653 [2024-07-12 22:23:06.891674] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:56.653 [2024-07-12 22:23:06.891712] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:57.221 22:23:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:57.221 22:23:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:57.221 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:57.479 [2024-07-12 22:23:07.678399] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:57.479 [2024-07-12 22:23:07.678441] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:57.479 [2024-07-12 22:23:07.678453] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:57.479 [2024-07-12 22:23:07.678466] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:57.479 [2024-07-12 22:23:07.678475] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:57.479 [2024-07-12 22:23:07.678486] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:57.479 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:57.479 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:57.479 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:57.479 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:57.479 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:57.479 22:23:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:57.479 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:57.479 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:57.479 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:57.479 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:57.479 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:57.479 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.737 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:57.737 "name": "Existed_Raid", 00:15:57.737 "uuid": "122072ea-c7be-44e6-9409-b4c2b14cf92a", 00:15:57.737 "strip_size_kb": 64, 00:15:57.737 "state": "configuring", 00:15:57.737 "raid_level": "concat", 00:15:57.737 "superblock": true, 00:15:57.737 "num_base_bdevs": 3, 00:15:57.737 "num_base_bdevs_discovered": 0, 00:15:57.737 "num_base_bdevs_operational": 3, 00:15:57.737 "base_bdevs_list": [ 00:15:57.737 { 00:15:57.737 "name": "BaseBdev1", 00:15:57.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.737 "is_configured": false, 00:15:57.737 "data_offset": 0, 00:15:57.737 "data_size": 0 00:15:57.737 }, 00:15:57.737 { 00:15:57.737 "name": "BaseBdev2", 00:15:57.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.737 "is_configured": false, 00:15:57.737 "data_offset": 0, 00:15:57.737 "data_size": 0 00:15:57.737 }, 00:15:57.737 { 00:15:57.737 "name": "BaseBdev3", 00:15:57.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.737 "is_configured": false, 00:15:57.737 "data_offset": 0, 00:15:57.737 "data_size": 0 00:15:57.737 } 00:15:57.737 ] 00:15:57.737 }' 00:15:57.737 22:23:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:57.737 22:23:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:58.303 22:23:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:58.561 [2024-07-12 22:23:08.692914] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:58.561 [2024-07-12 22:23:08.692952] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2472a80 name Existed_Raid, state configuring 00:15:58.561 22:23:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:58.819 [2024-07-12 22:23:08.941614] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:58.819 [2024-07-12 22:23:08.941646] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:58.819 [2024-07-12 22:23:08.941656] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:58.819 [2024-07-12 22:23:08.941668] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
doesn't exist now 00:15:58.819 [2024-07-12 22:23:08.941677] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:58.819 [2024-07-12 22:23:08.941687] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:58.819 22:23:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:59.077 [2024-07-12 22:23:09.192202] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:59.077 BaseBdev1 00:15:59.077 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:59.077 22:23:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:59.077 22:23:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:59.077 22:23:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:59.077 22:23:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:59.077 22:23:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:59.077 22:23:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:59.335 22:23:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:59.593 [ 00:15:59.593 { 00:15:59.593 "name": "BaseBdev1", 00:15:59.593 "aliases": [ 00:15:59.593 "e77287ed-a026-4e17-80a0-063e10eb4ed4" 00:15:59.593 ], 00:15:59.593 "product_name": "Malloc disk", 00:15:59.593 "block_size": 512, 00:15:59.593 "num_blocks": 65536, 00:15:59.593 "uuid": "e77287ed-a026-4e17-80a0-063e10eb4ed4", 00:15:59.593 "assigned_rate_limits": { 00:15:59.593 "rw_ios_per_sec": 0, 00:15:59.593 "rw_mbytes_per_sec": 0, 00:15:59.593 "r_mbytes_per_sec": 0, 00:15:59.593 "w_mbytes_per_sec": 0 00:15:59.593 }, 00:15:59.593 "claimed": true, 00:15:59.593 "claim_type": "exclusive_write", 00:15:59.593 "zoned": false, 00:15:59.593 "supported_io_types": { 00:15:59.593 "read": true, 00:15:59.593 "write": true, 00:15:59.593 "unmap": true, 00:15:59.593 "flush": true, 00:15:59.593 "reset": true, 00:15:59.593 "nvme_admin": false, 00:15:59.593 "nvme_io": false, 00:15:59.593 "nvme_io_md": false, 00:15:59.593 "write_zeroes": true, 00:15:59.593 "zcopy": true, 00:15:59.593 "get_zone_info": false, 00:15:59.593 "zone_management": false, 00:15:59.593 "zone_append": false, 00:15:59.593 "compare": false, 00:15:59.593 "compare_and_write": false, 00:15:59.593 "abort": true, 00:15:59.593 "seek_hole": false, 00:15:59.593 "seek_data": false, 00:15:59.593 "copy": true, 00:15:59.593 "nvme_iov_md": false 00:15:59.593 }, 00:15:59.593 "memory_domains": [ 00:15:59.593 { 00:15:59.593 "dma_device_id": "system", 00:15:59.593 "dma_device_type": 1 00:15:59.593 }, 00:15:59.593 { 00:15:59.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.593 "dma_device_type": 2 00:15:59.593 } 00:15:59.593 ], 00:15:59.593 "driver_specific": {} 00:15:59.593 } 00:15:59.593 ] 00:15:59.593 22:23:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:59.593 
22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:59.593 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:59.593 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:59.593 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:59.593 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:59.593 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:59.593 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.593 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.593 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.593 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.593 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.593 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:59.851 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.851 "name": "Existed_Raid", 00:15:59.851 "uuid": "35f92b97-b203-4612-a257-26d0f0a93060", 00:15:59.851 "strip_size_kb": 64, 00:15:59.851 "state": "configuring", 00:15:59.851 "raid_level": "concat", 00:15:59.851 "superblock": true, 00:15:59.851 "num_base_bdevs": 3, 00:15:59.851 "num_base_bdevs_discovered": 1, 00:15:59.851 "num_base_bdevs_operational": 3, 00:15:59.851 "base_bdevs_list": [ 00:15:59.851 { 00:15:59.851 "name": "BaseBdev1", 00:15:59.851 "uuid": "e77287ed-a026-4e17-80a0-063e10eb4ed4", 00:15:59.851 "is_configured": true, 00:15:59.851 "data_offset": 2048, 00:15:59.851 "data_size": 63488 00:15:59.851 }, 00:15:59.851 { 00:15:59.851 "name": "BaseBdev2", 00:15:59.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.851 "is_configured": false, 00:15:59.851 "data_offset": 0, 00:15:59.851 "data_size": 0 00:15:59.851 }, 00:15:59.851 { 00:15:59.851 "name": "BaseBdev3", 00:15:59.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.851 "is_configured": false, 00:15:59.851 "data_offset": 0, 00:15:59.851 "data_size": 0 00:15:59.851 } 00:15:59.851 ] 00:15:59.851 }' 00:15:59.851 22:23:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.851 22:23:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:00.418 22:23:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:00.677 [2024-07-12 22:23:10.768380] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:00.677 [2024-07-12 22:23:10.768428] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2472310 name Existed_Raid, state configuring 00:16:00.677 22:23:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:00.936 [2024-07-12 22:23:11.013066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:00.936 [2024-07-12 22:23:11.014507] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:00.936 [2024-07-12 22:23:11.014539] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:00.936 [2024-07-12 22:23:11.014555] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:00.936 [2024-07-12 22:23:11.014567] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:00.936 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:00.936 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:00.936 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:00.936 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:00.936 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:00.936 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:00.937 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:00.937 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:00.937 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:00.937 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:00.937 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:00.937 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:00.937 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.937 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:01.196 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:01.196 "name": "Existed_Raid", 00:16:01.196 "uuid": "4e6a20f3-75b5-4990-b937-ea172245429b", 00:16:01.196 "strip_size_kb": 64, 00:16:01.196 "state": "configuring", 00:16:01.196 "raid_level": "concat", 00:16:01.196 "superblock": true, 00:16:01.196 "num_base_bdevs": 3, 00:16:01.196 "num_base_bdevs_discovered": 1, 00:16:01.196 "num_base_bdevs_operational": 3, 00:16:01.196 "base_bdevs_list": [ 00:16:01.196 { 00:16:01.196 "name": "BaseBdev1", 00:16:01.196 "uuid": "e77287ed-a026-4e17-80a0-063e10eb4ed4", 00:16:01.196 "is_configured": true, 00:16:01.196 "data_offset": 2048, 00:16:01.196 "data_size": 63488 00:16:01.196 }, 00:16:01.196 { 00:16:01.196 "name": "BaseBdev2", 00:16:01.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.196 "is_configured": false, 00:16:01.196 "data_offset": 0, 00:16:01.196 "data_size": 0 00:16:01.196 }, 00:16:01.196 { 
00:16:01.196 "name": "BaseBdev3", 00:16:01.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.196 "is_configured": false, 00:16:01.196 "data_offset": 0, 00:16:01.196 "data_size": 0 00:16:01.196 } 00:16:01.196 ] 00:16:01.196 }' 00:16:01.196 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:01.196 22:23:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:01.765 22:23:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:02.024 [2024-07-12 22:23:12.099304] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:02.024 BaseBdev2 00:16:02.024 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:02.024 22:23:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:02.024 22:23:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:02.024 22:23:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:02.024 22:23:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:02.024 22:23:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:02.024 22:23:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:02.284 [ 00:16:02.284 { 00:16:02.284 "name": "BaseBdev2", 00:16:02.284 "aliases": [ 00:16:02.284 "ff0eded4-df37-493f-80d8-d35f3e3cdfce" 00:16:02.284 ], 00:16:02.284 "product_name": "Malloc disk", 00:16:02.284 "block_size": 512, 00:16:02.284 "num_blocks": 65536, 00:16:02.284 "uuid": "ff0eded4-df37-493f-80d8-d35f3e3cdfce", 00:16:02.284 "assigned_rate_limits": { 00:16:02.284 "rw_ios_per_sec": 0, 00:16:02.284 "rw_mbytes_per_sec": 0, 00:16:02.284 "r_mbytes_per_sec": 0, 00:16:02.284 "w_mbytes_per_sec": 0 00:16:02.284 }, 00:16:02.284 "claimed": true, 00:16:02.284 "claim_type": "exclusive_write", 00:16:02.284 "zoned": false, 00:16:02.284 "supported_io_types": { 00:16:02.284 "read": true, 00:16:02.284 "write": true, 00:16:02.284 "unmap": true, 00:16:02.284 "flush": true, 00:16:02.284 "reset": true, 00:16:02.284 "nvme_admin": false, 00:16:02.284 "nvme_io": false, 00:16:02.284 "nvme_io_md": false, 00:16:02.284 "write_zeroes": true, 00:16:02.284 "zcopy": true, 00:16:02.284 "get_zone_info": false, 00:16:02.284 "zone_management": false, 00:16:02.284 "zone_append": false, 00:16:02.284 "compare": false, 00:16:02.284 "compare_and_write": false, 00:16:02.284 "abort": true, 00:16:02.284 "seek_hole": false, 00:16:02.284 "seek_data": false, 00:16:02.284 "copy": true, 00:16:02.284 "nvme_iov_md": false 00:16:02.284 }, 00:16:02.284 "memory_domains": [ 00:16:02.284 { 00:16:02.284 "dma_device_id": "system", 00:16:02.284 "dma_device_type": 1 00:16:02.284 }, 00:16:02.284 { 00:16:02.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:02.284 "dma_device_type": 2 00:16:02.284 } 00:16:02.284 ], 00:16:02.284 
"driver_specific": {} 00:16:02.284 } 00:16:02.284 ] 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.284 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.543 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.543 "name": "Existed_Raid", 00:16:02.544 "uuid": "4e6a20f3-75b5-4990-b937-ea172245429b", 00:16:02.544 "strip_size_kb": 64, 00:16:02.544 "state": "configuring", 00:16:02.544 "raid_level": "concat", 00:16:02.544 "superblock": true, 00:16:02.544 "num_base_bdevs": 3, 00:16:02.544 "num_base_bdevs_discovered": 2, 00:16:02.544 "num_base_bdevs_operational": 3, 00:16:02.544 "base_bdevs_list": [ 00:16:02.544 { 00:16:02.544 "name": "BaseBdev1", 00:16:02.544 "uuid": "e77287ed-a026-4e17-80a0-063e10eb4ed4", 00:16:02.544 "is_configured": true, 00:16:02.544 "data_offset": 2048, 00:16:02.544 "data_size": 63488 00:16:02.544 }, 00:16:02.544 { 00:16:02.544 "name": "BaseBdev2", 00:16:02.544 "uuid": "ff0eded4-df37-493f-80d8-d35f3e3cdfce", 00:16:02.544 "is_configured": true, 00:16:02.544 "data_offset": 2048, 00:16:02.544 "data_size": 63488 00:16:02.544 }, 00:16:02.544 { 00:16:02.544 "name": "BaseBdev3", 00:16:02.544 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.544 "is_configured": false, 00:16:02.544 "data_offset": 0, 00:16:02.544 "data_size": 0 00:16:02.544 } 00:16:02.544 ] 00:16:02.544 }' 00:16:02.544 22:23:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.544 22:23:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:03.111 22:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 
00:16:03.369 [2024-07-12 22:23:13.662957] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:03.369 [2024-07-12 22:23:13.663126] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2473400 00:16:03.369 [2024-07-12 22:23:13.663140] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:03.369 [2024-07-12 22:23:13.663318] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2472ef0 00:16:03.369 [2024-07-12 22:23:13.663436] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2473400 00:16:03.369 [2024-07-12 22:23:13.663446] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2473400 00:16:03.369 [2024-07-12 22:23:13.663538] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:03.369 BaseBdev3 00:16:03.369 22:23:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:03.369 22:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:03.369 22:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:03.369 22:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:03.369 22:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:03.369 22:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:03.369 22:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:03.628 22:23:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:03.887 [ 00:16:03.888 { 00:16:03.888 "name": "BaseBdev3", 00:16:03.888 "aliases": [ 00:16:03.888 "123bd0a3-670f-42df-8d98-f877b3b9c6df" 00:16:03.888 ], 00:16:03.888 "product_name": "Malloc disk", 00:16:03.888 "block_size": 512, 00:16:03.888 "num_blocks": 65536, 00:16:03.888 "uuid": "123bd0a3-670f-42df-8d98-f877b3b9c6df", 00:16:03.888 "assigned_rate_limits": { 00:16:03.888 "rw_ios_per_sec": 0, 00:16:03.888 "rw_mbytes_per_sec": 0, 00:16:03.888 "r_mbytes_per_sec": 0, 00:16:03.888 "w_mbytes_per_sec": 0 00:16:03.888 }, 00:16:03.888 "claimed": true, 00:16:03.888 "claim_type": "exclusive_write", 00:16:03.888 "zoned": false, 00:16:03.888 "supported_io_types": { 00:16:03.888 "read": true, 00:16:03.888 "write": true, 00:16:03.888 "unmap": true, 00:16:03.888 "flush": true, 00:16:03.888 "reset": true, 00:16:03.888 "nvme_admin": false, 00:16:03.888 "nvme_io": false, 00:16:03.888 "nvme_io_md": false, 00:16:03.888 "write_zeroes": true, 00:16:03.888 "zcopy": true, 00:16:03.888 "get_zone_info": false, 00:16:03.888 "zone_management": false, 00:16:03.888 "zone_append": false, 00:16:03.888 "compare": false, 00:16:03.888 "compare_and_write": false, 00:16:03.888 "abort": true, 00:16:03.888 "seek_hole": false, 00:16:03.888 "seek_data": false, 00:16:03.888 "copy": true, 00:16:03.888 "nvme_iov_md": false 00:16:03.888 }, 00:16:03.888 "memory_domains": [ 00:16:03.888 { 00:16:03.888 "dma_device_id": "system", 00:16:03.888 "dma_device_type": 1 00:16:03.888 }, 00:16:03.888 { 00:16:03.888 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:03.888 "dma_device_type": 2 00:16:03.888 } 00:16:03.888 ], 00:16:03.888 "driver_specific": {} 00:16:03.888 } 00:16:03.888 ] 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.888 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.147 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.147 "name": "Existed_Raid", 00:16:04.147 "uuid": "4e6a20f3-75b5-4990-b937-ea172245429b", 00:16:04.147 "strip_size_kb": 64, 00:16:04.147 "state": "online", 00:16:04.147 "raid_level": "concat", 00:16:04.147 "superblock": true, 00:16:04.147 "num_base_bdevs": 3, 00:16:04.147 "num_base_bdevs_discovered": 3, 00:16:04.147 "num_base_bdevs_operational": 3, 00:16:04.147 "base_bdevs_list": [ 00:16:04.147 { 00:16:04.147 "name": "BaseBdev1", 00:16:04.147 "uuid": "e77287ed-a026-4e17-80a0-063e10eb4ed4", 00:16:04.147 "is_configured": true, 00:16:04.147 "data_offset": 2048, 00:16:04.147 "data_size": 63488 00:16:04.147 }, 00:16:04.147 { 00:16:04.147 "name": "BaseBdev2", 00:16:04.147 "uuid": "ff0eded4-df37-493f-80d8-d35f3e3cdfce", 00:16:04.147 "is_configured": true, 00:16:04.147 "data_offset": 2048, 00:16:04.147 "data_size": 63488 00:16:04.147 }, 00:16:04.147 { 00:16:04.147 "name": "BaseBdev3", 00:16:04.147 "uuid": "123bd0a3-670f-42df-8d98-f877b3b9c6df", 00:16:04.147 "is_configured": true, 00:16:04.147 "data_offset": 2048, 00:16:04.147 "data_size": 63488 00:16:04.147 } 00:16:04.147 ] 00:16:04.147 }' 00:16:04.147 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.147 22:23:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:04.717 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 
00:16:04.717 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:04.717 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:04.717 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:04.717 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:04.717 22:23:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:04.717 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:04.717 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:04.977 [2024-07-12 22:23:15.155248] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:04.977 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:04.977 "name": "Existed_Raid", 00:16:04.977 "aliases": [ 00:16:04.977 "4e6a20f3-75b5-4990-b937-ea172245429b" 00:16:04.977 ], 00:16:04.977 "product_name": "Raid Volume", 00:16:04.977 "block_size": 512, 00:16:04.977 "num_blocks": 190464, 00:16:04.977 "uuid": "4e6a20f3-75b5-4990-b937-ea172245429b", 00:16:04.977 "assigned_rate_limits": { 00:16:04.977 "rw_ios_per_sec": 0, 00:16:04.977 "rw_mbytes_per_sec": 0, 00:16:04.977 "r_mbytes_per_sec": 0, 00:16:04.977 "w_mbytes_per_sec": 0 00:16:04.977 }, 00:16:04.977 "claimed": false, 00:16:04.977 "zoned": false, 00:16:04.977 "supported_io_types": { 00:16:04.977 "read": true, 00:16:04.977 "write": true, 00:16:04.977 "unmap": true, 00:16:04.977 "flush": true, 00:16:04.977 "reset": true, 00:16:04.977 "nvme_admin": false, 00:16:04.977 "nvme_io": false, 00:16:04.977 "nvme_io_md": false, 00:16:04.977 "write_zeroes": true, 00:16:04.977 "zcopy": false, 00:16:04.977 "get_zone_info": false, 00:16:04.977 "zone_management": false, 00:16:04.977 "zone_append": false, 00:16:04.977 "compare": false, 00:16:04.977 "compare_and_write": false, 00:16:04.977 "abort": false, 00:16:04.977 "seek_hole": false, 00:16:04.977 "seek_data": false, 00:16:04.977 "copy": false, 00:16:04.977 "nvme_iov_md": false 00:16:04.977 }, 00:16:04.977 "memory_domains": [ 00:16:04.977 { 00:16:04.977 "dma_device_id": "system", 00:16:04.977 "dma_device_type": 1 00:16:04.977 }, 00:16:04.977 { 00:16:04.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.977 "dma_device_type": 2 00:16:04.977 }, 00:16:04.977 { 00:16:04.977 "dma_device_id": "system", 00:16:04.977 "dma_device_type": 1 00:16:04.977 }, 00:16:04.977 { 00:16:04.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.977 "dma_device_type": 2 00:16:04.977 }, 00:16:04.977 { 00:16:04.977 "dma_device_id": "system", 00:16:04.977 "dma_device_type": 1 00:16:04.977 }, 00:16:04.977 { 00:16:04.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.977 "dma_device_type": 2 00:16:04.977 } 00:16:04.977 ], 00:16:04.977 "driver_specific": { 00:16:04.977 "raid": { 00:16:04.977 "uuid": "4e6a20f3-75b5-4990-b937-ea172245429b", 00:16:04.977 "strip_size_kb": 64, 00:16:04.977 "state": "online", 00:16:04.977 "raid_level": "concat", 00:16:04.977 "superblock": true, 00:16:04.977 "num_base_bdevs": 3, 00:16:04.977 "num_base_bdevs_discovered": 3, 00:16:04.977 "num_base_bdevs_operational": 3, 00:16:04.977 "base_bdevs_list": [ 00:16:04.977 { 00:16:04.977 "name": "BaseBdev1", 00:16:04.977 
"uuid": "e77287ed-a026-4e17-80a0-063e10eb4ed4", 00:16:04.977 "is_configured": true, 00:16:04.977 "data_offset": 2048, 00:16:04.977 "data_size": 63488 00:16:04.977 }, 00:16:04.977 { 00:16:04.977 "name": "BaseBdev2", 00:16:04.977 "uuid": "ff0eded4-df37-493f-80d8-d35f3e3cdfce", 00:16:04.977 "is_configured": true, 00:16:04.977 "data_offset": 2048, 00:16:04.977 "data_size": 63488 00:16:04.977 }, 00:16:04.977 { 00:16:04.977 "name": "BaseBdev3", 00:16:04.977 "uuid": "123bd0a3-670f-42df-8d98-f877b3b9c6df", 00:16:04.977 "is_configured": true, 00:16:04.977 "data_offset": 2048, 00:16:04.977 "data_size": 63488 00:16:04.977 } 00:16:04.977 ] 00:16:04.977 } 00:16:04.977 } 00:16:04.977 }' 00:16:04.977 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:04.977 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:04.977 BaseBdev2 00:16:04.977 BaseBdev3' 00:16:04.977 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:04.977 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:04.977 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:05.236 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:05.236 "name": "BaseBdev1", 00:16:05.236 "aliases": [ 00:16:05.236 "e77287ed-a026-4e17-80a0-063e10eb4ed4" 00:16:05.236 ], 00:16:05.236 "product_name": "Malloc disk", 00:16:05.236 "block_size": 512, 00:16:05.236 "num_blocks": 65536, 00:16:05.236 "uuid": "e77287ed-a026-4e17-80a0-063e10eb4ed4", 00:16:05.236 "assigned_rate_limits": { 00:16:05.236 "rw_ios_per_sec": 0, 00:16:05.236 "rw_mbytes_per_sec": 0, 00:16:05.236 "r_mbytes_per_sec": 0, 00:16:05.236 "w_mbytes_per_sec": 0 00:16:05.236 }, 00:16:05.236 "claimed": true, 00:16:05.236 "claim_type": "exclusive_write", 00:16:05.236 "zoned": false, 00:16:05.236 "supported_io_types": { 00:16:05.236 "read": true, 00:16:05.236 "write": true, 00:16:05.236 "unmap": true, 00:16:05.236 "flush": true, 00:16:05.236 "reset": true, 00:16:05.236 "nvme_admin": false, 00:16:05.236 "nvme_io": false, 00:16:05.236 "nvme_io_md": false, 00:16:05.236 "write_zeroes": true, 00:16:05.236 "zcopy": true, 00:16:05.236 "get_zone_info": false, 00:16:05.236 "zone_management": false, 00:16:05.236 "zone_append": false, 00:16:05.236 "compare": false, 00:16:05.236 "compare_and_write": false, 00:16:05.236 "abort": true, 00:16:05.236 "seek_hole": false, 00:16:05.236 "seek_data": false, 00:16:05.236 "copy": true, 00:16:05.236 "nvme_iov_md": false 00:16:05.236 }, 00:16:05.236 "memory_domains": [ 00:16:05.236 { 00:16:05.236 "dma_device_id": "system", 00:16:05.236 "dma_device_type": 1 00:16:05.236 }, 00:16:05.236 { 00:16:05.236 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.236 "dma_device_type": 2 00:16:05.236 } 00:16:05.236 ], 00:16:05.236 "driver_specific": {} 00:16:05.236 }' 00:16:05.236 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.236 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.236 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:05.236 22:23:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:05.495 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:05.495 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:05.495 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:05.495 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:05.495 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:05.495 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:05.495 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:05.495 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:05.495 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:05.495 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:05.495 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:05.755 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:05.755 "name": "BaseBdev2", 00:16:05.755 "aliases": [ 00:16:05.755 "ff0eded4-df37-493f-80d8-d35f3e3cdfce" 00:16:05.755 ], 00:16:05.755 "product_name": "Malloc disk", 00:16:05.755 "block_size": 512, 00:16:05.755 "num_blocks": 65536, 00:16:05.755 "uuid": "ff0eded4-df37-493f-80d8-d35f3e3cdfce", 00:16:05.755 "assigned_rate_limits": { 00:16:05.755 "rw_ios_per_sec": 0, 00:16:05.755 "rw_mbytes_per_sec": 0, 00:16:05.755 "r_mbytes_per_sec": 0, 00:16:05.755 "w_mbytes_per_sec": 0 00:16:05.755 }, 00:16:05.755 "claimed": true, 00:16:05.755 "claim_type": "exclusive_write", 00:16:05.755 "zoned": false, 00:16:05.755 "supported_io_types": { 00:16:05.755 "read": true, 00:16:05.755 "write": true, 00:16:05.755 "unmap": true, 00:16:05.755 "flush": true, 00:16:05.755 "reset": true, 00:16:05.755 "nvme_admin": false, 00:16:05.755 "nvme_io": false, 00:16:05.755 "nvme_io_md": false, 00:16:05.755 "write_zeroes": true, 00:16:05.755 "zcopy": true, 00:16:05.755 "get_zone_info": false, 00:16:05.755 "zone_management": false, 00:16:05.755 "zone_append": false, 00:16:05.755 "compare": false, 00:16:05.755 "compare_and_write": false, 00:16:05.755 "abort": true, 00:16:05.755 "seek_hole": false, 00:16:05.755 "seek_data": false, 00:16:05.755 "copy": true, 00:16:05.755 "nvme_iov_md": false 00:16:05.755 }, 00:16:05.755 "memory_domains": [ 00:16:05.755 { 00:16:05.755 "dma_device_id": "system", 00:16:05.755 "dma_device_type": 1 00:16:05.755 }, 00:16:05.755 { 00:16:05.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.755 "dma_device_type": 2 00:16:05.755 } 00:16:05.755 ], 00:16:05.755 "driver_specific": {} 00:16:05.755 }' 00:16:05.755 22:23:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.755 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.014 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.014 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.014 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:16:06.014 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:06.014 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.014 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.014 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:06.014 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.014 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.274 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:06.274 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.274 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:06.274 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:06.274 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:06.274 "name": "BaseBdev3", 00:16:06.274 "aliases": [ 00:16:06.274 "123bd0a3-670f-42df-8d98-f877b3b9c6df" 00:16:06.274 ], 00:16:06.274 "product_name": "Malloc disk", 00:16:06.274 "block_size": 512, 00:16:06.274 "num_blocks": 65536, 00:16:06.274 "uuid": "123bd0a3-670f-42df-8d98-f877b3b9c6df", 00:16:06.274 "assigned_rate_limits": { 00:16:06.274 "rw_ios_per_sec": 0, 00:16:06.274 "rw_mbytes_per_sec": 0, 00:16:06.274 "r_mbytes_per_sec": 0, 00:16:06.274 "w_mbytes_per_sec": 0 00:16:06.274 }, 00:16:06.274 "claimed": true, 00:16:06.274 "claim_type": "exclusive_write", 00:16:06.274 "zoned": false, 00:16:06.274 "supported_io_types": { 00:16:06.274 "read": true, 00:16:06.274 "write": true, 00:16:06.274 "unmap": true, 00:16:06.274 "flush": true, 00:16:06.274 "reset": true, 00:16:06.274 "nvme_admin": false, 00:16:06.274 "nvme_io": false, 00:16:06.274 "nvme_io_md": false, 00:16:06.274 "write_zeroes": true, 00:16:06.274 "zcopy": true, 00:16:06.274 "get_zone_info": false, 00:16:06.274 "zone_management": false, 00:16:06.274 "zone_append": false, 00:16:06.274 "compare": false, 00:16:06.274 "compare_and_write": false, 00:16:06.274 "abort": true, 00:16:06.274 "seek_hole": false, 00:16:06.274 "seek_data": false, 00:16:06.274 "copy": true, 00:16:06.274 "nvme_iov_md": false 00:16:06.274 }, 00:16:06.274 "memory_domains": [ 00:16:06.274 { 00:16:06.274 "dma_device_id": "system", 00:16:06.274 "dma_device_type": 1 00:16:06.274 }, 00:16:06.274 { 00:16:06.274 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.275 "dma_device_type": 2 00:16:06.275 } 00:16:06.275 ], 00:16:06.275 "driver_specific": {} 00:16:06.275 }' 00:16:06.275 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.534 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.534 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.534 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.534 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.534 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:06.534 
22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.534 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.534 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:06.793 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.793 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.793 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:06.793 22:23:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:07.122 [2024-07-12 22:23:17.180397] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:07.122 [2024-07-12 22:23:17.180425] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:07.122 [2024-07-12 22:23:17.180468] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.122 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.425 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:07.425 "name": "Existed_Raid", 00:16:07.425 "uuid": "4e6a20f3-75b5-4990-b937-ea172245429b", 00:16:07.425 "strip_size_kb": 64, 00:16:07.425 "state": "offline", 00:16:07.425 "raid_level": 
"concat", 00:16:07.425 "superblock": true, 00:16:07.425 "num_base_bdevs": 3, 00:16:07.425 "num_base_bdevs_discovered": 2, 00:16:07.425 "num_base_bdevs_operational": 2, 00:16:07.425 "base_bdevs_list": [ 00:16:07.425 { 00:16:07.425 "name": null, 00:16:07.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.425 "is_configured": false, 00:16:07.425 "data_offset": 2048, 00:16:07.425 "data_size": 63488 00:16:07.425 }, 00:16:07.425 { 00:16:07.425 "name": "BaseBdev2", 00:16:07.425 "uuid": "ff0eded4-df37-493f-80d8-d35f3e3cdfce", 00:16:07.425 "is_configured": true, 00:16:07.425 "data_offset": 2048, 00:16:07.425 "data_size": 63488 00:16:07.425 }, 00:16:07.425 { 00:16:07.425 "name": "BaseBdev3", 00:16:07.425 "uuid": "123bd0a3-670f-42df-8d98-f877b3b9c6df", 00:16:07.425 "is_configured": true, 00:16:07.425 "data_offset": 2048, 00:16:07.425 "data_size": 63488 00:16:07.425 } 00:16:07.425 ] 00:16:07.425 }' 00:16:07.425 22:23:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:07.425 22:23:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:07.994 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:07.994 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:07.994 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.994 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:07.994 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:07.994 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:07.994 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:08.294 [2024-07-12 22:23:18.525004] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:08.294 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:08.294 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:08.294 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.294 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:08.553 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:08.553 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:08.553 22:23:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:08.812 [2024-07-12 22:23:19.041075] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:08.812 [2024-07-12 22:23:19.041120] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2473400 name Existed_Raid, state offline 00:16:08.812 22:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:16:08.812 22:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:08.812 22:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.812 22:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:09.071 22:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:09.071 22:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:09.071 22:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:09.071 22:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:09.071 22:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:09.071 22:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:09.329 BaseBdev2 00:16:09.329 22:23:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:09.329 22:23:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:09.329 22:23:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:09.329 22:23:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:09.329 22:23:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:09.329 22:23:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:09.329 22:23:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:09.588 22:23:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:09.847 [ 00:16:09.847 { 00:16:09.847 "name": "BaseBdev2", 00:16:09.847 "aliases": [ 00:16:09.847 "26697972-fa17-4533-8cd2-ce1a427f0863" 00:16:09.847 ], 00:16:09.847 "product_name": "Malloc disk", 00:16:09.847 "block_size": 512, 00:16:09.847 "num_blocks": 65536, 00:16:09.847 "uuid": "26697972-fa17-4533-8cd2-ce1a427f0863", 00:16:09.847 "assigned_rate_limits": { 00:16:09.847 "rw_ios_per_sec": 0, 00:16:09.847 "rw_mbytes_per_sec": 0, 00:16:09.847 "r_mbytes_per_sec": 0, 00:16:09.847 "w_mbytes_per_sec": 0 00:16:09.847 }, 00:16:09.847 "claimed": false, 00:16:09.847 "zoned": false, 00:16:09.847 "supported_io_types": { 00:16:09.847 "read": true, 00:16:09.847 "write": true, 00:16:09.847 "unmap": true, 00:16:09.847 "flush": true, 00:16:09.847 "reset": true, 00:16:09.847 "nvme_admin": false, 00:16:09.847 "nvme_io": false, 00:16:09.847 "nvme_io_md": false, 00:16:09.847 "write_zeroes": true, 00:16:09.847 "zcopy": true, 00:16:09.847 "get_zone_info": false, 00:16:09.847 "zone_management": false, 00:16:09.847 "zone_append": false, 00:16:09.847 "compare": false, 00:16:09.847 "compare_and_write": false, 00:16:09.847 "abort": true, 00:16:09.847 "seek_hole": false, 00:16:09.847 "seek_data": false, 00:16:09.847 "copy": 
true, 00:16:09.847 "nvme_iov_md": false 00:16:09.847 }, 00:16:09.847 "memory_domains": [ 00:16:09.847 { 00:16:09.847 "dma_device_id": "system", 00:16:09.847 "dma_device_type": 1 00:16:09.847 }, 00:16:09.847 { 00:16:09.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.847 "dma_device_type": 2 00:16:09.847 } 00:16:09.847 ], 00:16:09.847 "driver_specific": {} 00:16:09.847 } 00:16:09.847 ] 00:16:09.847 22:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:09.847 22:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:09.847 22:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:09.847 22:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:10.105 BaseBdev3 00:16:10.106 22:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:10.106 22:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:10.106 22:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:10.106 22:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:10.106 22:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:10.106 22:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:10.106 22:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:10.364 22:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:10.623 [ 00:16:10.623 { 00:16:10.623 "name": "BaseBdev3", 00:16:10.623 "aliases": [ 00:16:10.623 "c542222a-b616-4b3e-935d-70069d13c8bd" 00:16:10.623 ], 00:16:10.623 "product_name": "Malloc disk", 00:16:10.623 "block_size": 512, 00:16:10.623 "num_blocks": 65536, 00:16:10.623 "uuid": "c542222a-b616-4b3e-935d-70069d13c8bd", 00:16:10.623 "assigned_rate_limits": { 00:16:10.623 "rw_ios_per_sec": 0, 00:16:10.623 "rw_mbytes_per_sec": 0, 00:16:10.623 "r_mbytes_per_sec": 0, 00:16:10.623 "w_mbytes_per_sec": 0 00:16:10.623 }, 00:16:10.623 "claimed": false, 00:16:10.623 "zoned": false, 00:16:10.623 "supported_io_types": { 00:16:10.623 "read": true, 00:16:10.623 "write": true, 00:16:10.623 "unmap": true, 00:16:10.623 "flush": true, 00:16:10.623 "reset": true, 00:16:10.623 "nvme_admin": false, 00:16:10.623 "nvme_io": false, 00:16:10.623 "nvme_io_md": false, 00:16:10.623 "write_zeroes": true, 00:16:10.623 "zcopy": true, 00:16:10.623 "get_zone_info": false, 00:16:10.623 "zone_management": false, 00:16:10.623 "zone_append": false, 00:16:10.623 "compare": false, 00:16:10.623 "compare_and_write": false, 00:16:10.623 "abort": true, 00:16:10.623 "seek_hole": false, 00:16:10.623 "seek_data": false, 00:16:10.623 "copy": true, 00:16:10.623 "nvme_iov_md": false 00:16:10.623 }, 00:16:10.623 "memory_domains": [ 00:16:10.623 { 00:16:10.623 "dma_device_id": "system", 00:16:10.623 "dma_device_type": 1 00:16:10.623 }, 00:16:10.623 { 00:16:10.623 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:10.623 "dma_device_type": 2 00:16:10.623 } 00:16:10.623 ], 00:16:10.623 "driver_specific": {} 00:16:10.623 } 00:16:10.623 ] 00:16:10.623 22:23:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:10.624 22:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:10.624 22:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:10.624 22:23:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:10.883 [2024-07-12 22:23:21.053875] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:10.883 [2024-07-12 22:23:21.053918] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:10.883 [2024-07-12 22:23:21.053944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:10.883 [2024-07-12 22:23:21.055333] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:10.883 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:10.883 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:10.883 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:10.883 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:10.883 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:10.883 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:10.883 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.883 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.883 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.883 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.883 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.883 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:11.143 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:11.143 "name": "Existed_Raid", 00:16:11.143 "uuid": "ff39b4ab-e334-4832-ba71-51110b262ee9", 00:16:11.143 "strip_size_kb": 64, 00:16:11.143 "state": "configuring", 00:16:11.143 "raid_level": "concat", 00:16:11.143 "superblock": true, 00:16:11.143 "num_base_bdevs": 3, 00:16:11.143 "num_base_bdevs_discovered": 2, 00:16:11.143 "num_base_bdevs_operational": 3, 00:16:11.143 "base_bdevs_list": [ 00:16:11.143 { 00:16:11.143 "name": "BaseBdev1", 00:16:11.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.143 "is_configured": false, 00:16:11.143 "data_offset": 0, 00:16:11.143 "data_size": 0 00:16:11.143 }, 00:16:11.143 { 00:16:11.143 "name": 
"BaseBdev2", 00:16:11.143 "uuid": "26697972-fa17-4533-8cd2-ce1a427f0863", 00:16:11.143 "is_configured": true, 00:16:11.143 "data_offset": 2048, 00:16:11.143 "data_size": 63488 00:16:11.143 }, 00:16:11.143 { 00:16:11.143 "name": "BaseBdev3", 00:16:11.143 "uuid": "c542222a-b616-4b3e-935d-70069d13c8bd", 00:16:11.143 "is_configured": true, 00:16:11.143 "data_offset": 2048, 00:16:11.143 "data_size": 63488 00:16:11.143 } 00:16:11.143 ] 00:16:11.143 }' 00:16:11.143 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:11.143 22:23:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:11.711 22:23:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:11.970 [2024-07-12 22:23:22.120682] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:11.970 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:11.970 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:11.970 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:11.970 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:11.970 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:11.970 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:11.970 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:11.970 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:11.970 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:11.970 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:11.970 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.970 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.229 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.229 "name": "Existed_Raid", 00:16:12.229 "uuid": "ff39b4ab-e334-4832-ba71-51110b262ee9", 00:16:12.229 "strip_size_kb": 64, 00:16:12.229 "state": "configuring", 00:16:12.229 "raid_level": "concat", 00:16:12.229 "superblock": true, 00:16:12.229 "num_base_bdevs": 3, 00:16:12.229 "num_base_bdevs_discovered": 1, 00:16:12.229 "num_base_bdevs_operational": 3, 00:16:12.229 "base_bdevs_list": [ 00:16:12.229 { 00:16:12.229 "name": "BaseBdev1", 00:16:12.229 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.229 "is_configured": false, 00:16:12.229 "data_offset": 0, 00:16:12.229 "data_size": 0 00:16:12.229 }, 00:16:12.229 { 00:16:12.229 "name": null, 00:16:12.229 "uuid": "26697972-fa17-4533-8cd2-ce1a427f0863", 00:16:12.229 "is_configured": false, 00:16:12.229 "data_offset": 2048, 00:16:12.229 "data_size": 63488 00:16:12.229 }, 00:16:12.229 { 00:16:12.229 "name": "BaseBdev3", 00:16:12.229 "uuid": 
"c542222a-b616-4b3e-935d-70069d13c8bd", 00:16:12.229 "is_configured": true, 00:16:12.229 "data_offset": 2048, 00:16:12.229 "data_size": 63488 00:16:12.229 } 00:16:12.229 ] 00:16:12.229 }' 00:16:12.229 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.229 22:23:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:12.798 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.798 22:23:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:13.058 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:13.058 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:13.317 [2024-07-12 22:23:23.447947] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:13.317 BaseBdev1 00:16:13.317 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:13.317 22:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:13.317 22:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:13.317 22:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:13.317 22:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:13.317 22:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:13.317 22:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:13.577 22:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:13.837 [ 00:16:13.837 { 00:16:13.837 "name": "BaseBdev1", 00:16:13.837 "aliases": [ 00:16:13.837 "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e" 00:16:13.837 ], 00:16:13.837 "product_name": "Malloc disk", 00:16:13.837 "block_size": 512, 00:16:13.837 "num_blocks": 65536, 00:16:13.837 "uuid": "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e", 00:16:13.837 "assigned_rate_limits": { 00:16:13.837 "rw_ios_per_sec": 0, 00:16:13.837 "rw_mbytes_per_sec": 0, 00:16:13.837 "r_mbytes_per_sec": 0, 00:16:13.837 "w_mbytes_per_sec": 0 00:16:13.837 }, 00:16:13.837 "claimed": true, 00:16:13.837 "claim_type": "exclusive_write", 00:16:13.837 "zoned": false, 00:16:13.837 "supported_io_types": { 00:16:13.837 "read": true, 00:16:13.837 "write": true, 00:16:13.837 "unmap": true, 00:16:13.837 "flush": true, 00:16:13.837 "reset": true, 00:16:13.837 "nvme_admin": false, 00:16:13.837 "nvme_io": false, 00:16:13.837 "nvme_io_md": false, 00:16:13.837 "write_zeroes": true, 00:16:13.837 "zcopy": true, 00:16:13.837 "get_zone_info": false, 00:16:13.837 "zone_management": false, 00:16:13.837 "zone_append": false, 00:16:13.837 "compare": false, 00:16:13.837 "compare_and_write": false, 00:16:13.837 "abort": true, 00:16:13.837 "seek_hole": false, 
00:16:13.837 "seek_data": false, 00:16:13.837 "copy": true, 00:16:13.837 "nvme_iov_md": false 00:16:13.837 }, 00:16:13.837 "memory_domains": [ 00:16:13.837 { 00:16:13.837 "dma_device_id": "system", 00:16:13.837 "dma_device_type": 1 00:16:13.837 }, 00:16:13.837 { 00:16:13.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.837 "dma_device_type": 2 00:16:13.837 } 00:16:13.837 ], 00:16:13.837 "driver_specific": {} 00:16:13.837 } 00:16:13.837 ] 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.837 22:23:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:14.096 22:23:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:14.096 "name": "Existed_Raid", 00:16:14.096 "uuid": "ff39b4ab-e334-4832-ba71-51110b262ee9", 00:16:14.096 "strip_size_kb": 64, 00:16:14.096 "state": "configuring", 00:16:14.096 "raid_level": "concat", 00:16:14.096 "superblock": true, 00:16:14.096 "num_base_bdevs": 3, 00:16:14.096 "num_base_bdevs_discovered": 2, 00:16:14.096 "num_base_bdevs_operational": 3, 00:16:14.096 "base_bdevs_list": [ 00:16:14.096 { 00:16:14.096 "name": "BaseBdev1", 00:16:14.096 "uuid": "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e", 00:16:14.096 "is_configured": true, 00:16:14.096 "data_offset": 2048, 00:16:14.096 "data_size": 63488 00:16:14.096 }, 00:16:14.096 { 00:16:14.096 "name": null, 00:16:14.096 "uuid": "26697972-fa17-4533-8cd2-ce1a427f0863", 00:16:14.096 "is_configured": false, 00:16:14.096 "data_offset": 2048, 00:16:14.096 "data_size": 63488 00:16:14.096 }, 00:16:14.096 { 00:16:14.096 "name": "BaseBdev3", 00:16:14.096 "uuid": "c542222a-b616-4b3e-935d-70069d13c8bd", 00:16:14.096 "is_configured": true, 00:16:14.096 "data_offset": 2048, 00:16:14.096 "data_size": 63488 00:16:14.096 } 00:16:14.096 ] 00:16:14.096 }' 00:16:14.096 22:23:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:14.096 22:23:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:14.665 22:23:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.665 22:23:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:14.924 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:14.924 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:14.924 [2024-07-12 22:23:25.248735] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:15.183 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:15.183 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.183 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:15.183 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:15.183 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:15.183 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:15.183 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.183 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.183 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.183 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.183 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.183 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:15.442 22:23:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:15.442 "name": "Existed_Raid", 00:16:15.442 "uuid": "ff39b4ab-e334-4832-ba71-51110b262ee9", 00:16:15.442 "strip_size_kb": 64, 00:16:15.442 "state": "configuring", 00:16:15.442 "raid_level": "concat", 00:16:15.442 "superblock": true, 00:16:15.442 "num_base_bdevs": 3, 00:16:15.442 "num_base_bdevs_discovered": 1, 00:16:15.442 "num_base_bdevs_operational": 3, 00:16:15.442 "base_bdevs_list": [ 00:16:15.442 { 00:16:15.442 "name": "BaseBdev1", 00:16:15.442 "uuid": "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e", 00:16:15.442 "is_configured": true, 00:16:15.442 "data_offset": 2048, 00:16:15.442 "data_size": 63488 00:16:15.442 }, 00:16:15.442 { 00:16:15.442 "name": null, 00:16:15.442 "uuid": "26697972-fa17-4533-8cd2-ce1a427f0863", 00:16:15.442 "is_configured": false, 00:16:15.442 "data_offset": 2048, 00:16:15.442 "data_size": 63488 00:16:15.442 }, 00:16:15.442 { 00:16:15.442 "name": null, 00:16:15.442 "uuid": "c542222a-b616-4b3e-935d-70069d13c8bd", 00:16:15.442 "is_configured": false, 00:16:15.442 "data_offset": 2048, 00:16:15.442 "data_size": 63488 00:16:15.442 } 00:16:15.442 ] 00:16:15.442 }' 00:16:15.442 22:23:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:15.442 22:23:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:16.010 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.010 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:16.010 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:16.010 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:16.269 [2024-07-12 22:23:26.508097] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:16.269 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:16.269 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:16.269 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:16.269 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:16.269 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:16.269 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:16.269 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:16.269 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:16.269 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:16.269 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:16.269 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.269 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:16.528 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.528 "name": "Existed_Raid", 00:16:16.528 "uuid": "ff39b4ab-e334-4832-ba71-51110b262ee9", 00:16:16.528 "strip_size_kb": 64, 00:16:16.528 "state": "configuring", 00:16:16.528 "raid_level": "concat", 00:16:16.528 "superblock": true, 00:16:16.528 "num_base_bdevs": 3, 00:16:16.528 "num_base_bdevs_discovered": 2, 00:16:16.528 "num_base_bdevs_operational": 3, 00:16:16.528 "base_bdevs_list": [ 00:16:16.528 { 00:16:16.528 "name": "BaseBdev1", 00:16:16.528 "uuid": "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e", 00:16:16.528 "is_configured": true, 00:16:16.528 "data_offset": 2048, 00:16:16.528 "data_size": 63488 00:16:16.528 }, 00:16:16.528 { 00:16:16.528 "name": null, 00:16:16.528 "uuid": "26697972-fa17-4533-8cd2-ce1a427f0863", 00:16:16.528 "is_configured": false, 00:16:16.528 "data_offset": 2048, 00:16:16.528 "data_size": 63488 00:16:16.528 }, 00:16:16.528 { 00:16:16.528 "name": "BaseBdev3", 00:16:16.528 "uuid": "c542222a-b616-4b3e-935d-70069d13c8bd", 
00:16:16.528 "is_configured": true, 00:16:16.528 "data_offset": 2048, 00:16:16.528 "data_size": 63488 00:16:16.528 } 00:16:16.528 ] 00:16:16.528 }' 00:16:16.528 22:23:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.528 22:23:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:17.097 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.097 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:17.355 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:17.355 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:17.615 [2024-07-12 22:23:27.807566] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:17.615 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:17.615 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.615 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:17.615 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:17.615 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:17.615 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:17.615 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.615 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.615 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.615 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.615 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.615 22:23:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:17.875 22:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.875 "name": "Existed_Raid", 00:16:17.875 "uuid": "ff39b4ab-e334-4832-ba71-51110b262ee9", 00:16:17.875 "strip_size_kb": 64, 00:16:17.875 "state": "configuring", 00:16:17.875 "raid_level": "concat", 00:16:17.875 "superblock": true, 00:16:17.875 "num_base_bdevs": 3, 00:16:17.875 "num_base_bdevs_discovered": 1, 00:16:17.875 "num_base_bdevs_operational": 3, 00:16:17.875 "base_bdevs_list": [ 00:16:17.875 { 00:16:17.875 "name": null, 00:16:17.875 "uuid": "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e", 00:16:17.875 "is_configured": false, 00:16:17.875 "data_offset": 2048, 00:16:17.875 "data_size": 63488 00:16:17.875 }, 00:16:17.875 { 00:16:17.875 "name": null, 00:16:17.875 "uuid": "26697972-fa17-4533-8cd2-ce1a427f0863", 00:16:17.875 "is_configured": false, 00:16:17.875 "data_offset": 2048, 
00:16:17.875 "data_size": 63488 00:16:17.875 }, 00:16:17.875 { 00:16:17.875 "name": "BaseBdev3", 00:16:17.875 "uuid": "c542222a-b616-4b3e-935d-70069d13c8bd", 00:16:17.875 "is_configured": true, 00:16:17.875 "data_offset": 2048, 00:16:17.875 "data_size": 63488 00:16:17.875 } 00:16:17.875 ] 00:16:17.875 }' 00:16:17.875 22:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:17.875 22:23:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:18.442 22:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.442 22:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:18.700 22:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:18.700 22:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:18.959 [2024-07-12 22:23:29.147518] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:18.959 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:18.959 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:18.959 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:18.959 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:18.959 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:18.959 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:18.959 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.959 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:18.959 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.959 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.959 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.959 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:19.219 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:19.219 "name": "Existed_Raid", 00:16:19.219 "uuid": "ff39b4ab-e334-4832-ba71-51110b262ee9", 00:16:19.219 "strip_size_kb": 64, 00:16:19.219 "state": "configuring", 00:16:19.219 "raid_level": "concat", 00:16:19.219 "superblock": true, 00:16:19.219 "num_base_bdevs": 3, 00:16:19.219 "num_base_bdevs_discovered": 2, 00:16:19.219 "num_base_bdevs_operational": 3, 00:16:19.219 "base_bdevs_list": [ 00:16:19.219 { 00:16:19.219 "name": null, 00:16:19.219 "uuid": "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e", 00:16:19.219 "is_configured": false, 00:16:19.219 "data_offset": 2048, 00:16:19.219 
"data_size": 63488 00:16:19.219 }, 00:16:19.219 { 00:16:19.219 "name": "BaseBdev2", 00:16:19.219 "uuid": "26697972-fa17-4533-8cd2-ce1a427f0863", 00:16:19.219 "is_configured": true, 00:16:19.219 "data_offset": 2048, 00:16:19.219 "data_size": 63488 00:16:19.219 }, 00:16:19.219 { 00:16:19.219 "name": "BaseBdev3", 00:16:19.219 "uuid": "c542222a-b616-4b3e-935d-70069d13c8bd", 00:16:19.219 "is_configured": true, 00:16:19.219 "data_offset": 2048, 00:16:19.219 "data_size": 63488 00:16:19.219 } 00:16:19.219 ] 00:16:19.219 }' 00:16:19.219 22:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:19.219 22:23:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:19.787 22:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.787 22:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:20.047 22:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:20.047 22:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.047 22:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:20.306 22:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 8850f7b9-f66b-4cde-b657-d34b5d9a8d0e 00:16:20.565 [2024-07-12 22:23:30.747191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:20.565 [2024-07-12 22:23:30.747342] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2471f50 00:16:20.565 [2024-07-12 22:23:30.747355] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:20.565 [2024-07-12 22:23:30.747531] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2178940 00:16:20.565 [2024-07-12 22:23:30.747647] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2471f50 00:16:20.565 [2024-07-12 22:23:30.747657] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2471f50 00:16:20.565 [2024-07-12 22:23:30.747752] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:20.565 NewBaseBdev 00:16:20.565 22:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:20.565 22:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:20.565 22:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:20.565 22:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:20.565 22:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:20.565 22:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:20.565 22:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:20.824 22:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:21.083 [ 00:16:21.083 { 00:16:21.083 "name": "NewBaseBdev", 00:16:21.083 "aliases": [ 00:16:21.083 "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e" 00:16:21.083 ], 00:16:21.083 "product_name": "Malloc disk", 00:16:21.083 "block_size": 512, 00:16:21.083 "num_blocks": 65536, 00:16:21.083 "uuid": "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e", 00:16:21.083 "assigned_rate_limits": { 00:16:21.083 "rw_ios_per_sec": 0, 00:16:21.083 "rw_mbytes_per_sec": 0, 00:16:21.083 "r_mbytes_per_sec": 0, 00:16:21.083 "w_mbytes_per_sec": 0 00:16:21.083 }, 00:16:21.083 "claimed": true, 00:16:21.083 "claim_type": "exclusive_write", 00:16:21.083 "zoned": false, 00:16:21.083 "supported_io_types": { 00:16:21.083 "read": true, 00:16:21.083 "write": true, 00:16:21.083 "unmap": true, 00:16:21.083 "flush": true, 00:16:21.083 "reset": true, 00:16:21.083 "nvme_admin": false, 00:16:21.083 "nvme_io": false, 00:16:21.083 "nvme_io_md": false, 00:16:21.083 "write_zeroes": true, 00:16:21.083 "zcopy": true, 00:16:21.083 "get_zone_info": false, 00:16:21.083 "zone_management": false, 00:16:21.083 "zone_append": false, 00:16:21.083 "compare": false, 00:16:21.083 "compare_and_write": false, 00:16:21.083 "abort": true, 00:16:21.083 "seek_hole": false, 00:16:21.083 "seek_data": false, 00:16:21.083 "copy": true, 00:16:21.083 "nvme_iov_md": false 00:16:21.083 }, 00:16:21.083 "memory_domains": [ 00:16:21.083 { 00:16:21.083 "dma_device_id": "system", 00:16:21.083 "dma_device_type": 1 00:16:21.083 }, 00:16:21.083 { 00:16:21.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.083 "dma_device_type": 2 00:16:21.083 } 00:16:21.083 ], 00:16:21.083 "driver_specific": {} 00:16:21.083 } 00:16:21.083 ] 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.083 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:16:21.342 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.342 "name": "Existed_Raid", 00:16:21.342 "uuid": "ff39b4ab-e334-4832-ba71-51110b262ee9", 00:16:21.342 "strip_size_kb": 64, 00:16:21.342 "state": "online", 00:16:21.342 "raid_level": "concat", 00:16:21.342 "superblock": true, 00:16:21.342 "num_base_bdevs": 3, 00:16:21.342 "num_base_bdevs_discovered": 3, 00:16:21.342 "num_base_bdevs_operational": 3, 00:16:21.342 "base_bdevs_list": [ 00:16:21.342 { 00:16:21.342 "name": "NewBaseBdev", 00:16:21.342 "uuid": "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e", 00:16:21.342 "is_configured": true, 00:16:21.342 "data_offset": 2048, 00:16:21.342 "data_size": 63488 00:16:21.342 }, 00:16:21.342 { 00:16:21.342 "name": "BaseBdev2", 00:16:21.342 "uuid": "26697972-fa17-4533-8cd2-ce1a427f0863", 00:16:21.342 "is_configured": true, 00:16:21.342 "data_offset": 2048, 00:16:21.342 "data_size": 63488 00:16:21.342 }, 00:16:21.342 { 00:16:21.342 "name": "BaseBdev3", 00:16:21.342 "uuid": "c542222a-b616-4b3e-935d-70069d13c8bd", 00:16:21.342 "is_configured": true, 00:16:21.342 "data_offset": 2048, 00:16:21.342 "data_size": 63488 00:16:21.342 } 00:16:21.342 ] 00:16:21.342 }' 00:16:21.342 22:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.342 22:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:21.906 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:21.906 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:21.906 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:21.906 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:21.906 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:21.906 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:21.906 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:21.906 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:22.165 [2024-07-12 22:23:32.311652] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:22.165 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:22.165 "name": "Existed_Raid", 00:16:22.165 "aliases": [ 00:16:22.165 "ff39b4ab-e334-4832-ba71-51110b262ee9" 00:16:22.165 ], 00:16:22.165 "product_name": "Raid Volume", 00:16:22.165 "block_size": 512, 00:16:22.165 "num_blocks": 190464, 00:16:22.165 "uuid": "ff39b4ab-e334-4832-ba71-51110b262ee9", 00:16:22.165 "assigned_rate_limits": { 00:16:22.165 "rw_ios_per_sec": 0, 00:16:22.165 "rw_mbytes_per_sec": 0, 00:16:22.165 "r_mbytes_per_sec": 0, 00:16:22.165 "w_mbytes_per_sec": 0 00:16:22.165 }, 00:16:22.165 "claimed": false, 00:16:22.165 "zoned": false, 00:16:22.165 "supported_io_types": { 00:16:22.165 "read": true, 00:16:22.165 "write": true, 00:16:22.165 "unmap": true, 00:16:22.165 "flush": true, 00:16:22.165 "reset": true, 00:16:22.165 "nvme_admin": false, 00:16:22.165 "nvme_io": false, 00:16:22.165 "nvme_io_md": false, 00:16:22.165 "write_zeroes": true, 00:16:22.165 
"zcopy": false, 00:16:22.165 "get_zone_info": false, 00:16:22.165 "zone_management": false, 00:16:22.165 "zone_append": false, 00:16:22.165 "compare": false, 00:16:22.165 "compare_and_write": false, 00:16:22.165 "abort": false, 00:16:22.165 "seek_hole": false, 00:16:22.165 "seek_data": false, 00:16:22.165 "copy": false, 00:16:22.165 "nvme_iov_md": false 00:16:22.165 }, 00:16:22.165 "memory_domains": [ 00:16:22.165 { 00:16:22.165 "dma_device_id": "system", 00:16:22.165 "dma_device_type": 1 00:16:22.165 }, 00:16:22.165 { 00:16:22.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.165 "dma_device_type": 2 00:16:22.165 }, 00:16:22.165 { 00:16:22.165 "dma_device_id": "system", 00:16:22.165 "dma_device_type": 1 00:16:22.165 }, 00:16:22.165 { 00:16:22.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.165 "dma_device_type": 2 00:16:22.165 }, 00:16:22.165 { 00:16:22.165 "dma_device_id": "system", 00:16:22.165 "dma_device_type": 1 00:16:22.165 }, 00:16:22.165 { 00:16:22.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.165 "dma_device_type": 2 00:16:22.165 } 00:16:22.165 ], 00:16:22.165 "driver_specific": { 00:16:22.165 "raid": { 00:16:22.165 "uuid": "ff39b4ab-e334-4832-ba71-51110b262ee9", 00:16:22.165 "strip_size_kb": 64, 00:16:22.165 "state": "online", 00:16:22.165 "raid_level": "concat", 00:16:22.165 "superblock": true, 00:16:22.165 "num_base_bdevs": 3, 00:16:22.165 "num_base_bdevs_discovered": 3, 00:16:22.165 "num_base_bdevs_operational": 3, 00:16:22.165 "base_bdevs_list": [ 00:16:22.165 { 00:16:22.165 "name": "NewBaseBdev", 00:16:22.165 "uuid": "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e", 00:16:22.165 "is_configured": true, 00:16:22.165 "data_offset": 2048, 00:16:22.165 "data_size": 63488 00:16:22.165 }, 00:16:22.165 { 00:16:22.165 "name": "BaseBdev2", 00:16:22.165 "uuid": "26697972-fa17-4533-8cd2-ce1a427f0863", 00:16:22.165 "is_configured": true, 00:16:22.165 "data_offset": 2048, 00:16:22.165 "data_size": 63488 00:16:22.165 }, 00:16:22.165 { 00:16:22.165 "name": "BaseBdev3", 00:16:22.165 "uuid": "c542222a-b616-4b3e-935d-70069d13c8bd", 00:16:22.165 "is_configured": true, 00:16:22.165 "data_offset": 2048, 00:16:22.165 "data_size": 63488 00:16:22.165 } 00:16:22.165 ] 00:16:22.165 } 00:16:22.165 } 00:16:22.165 }' 00:16:22.165 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:22.165 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:22.165 BaseBdev2 00:16:22.165 BaseBdev3' 00:16:22.165 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:22.165 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:22.165 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:22.424 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:22.424 "name": "NewBaseBdev", 00:16:22.424 "aliases": [ 00:16:22.424 "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e" 00:16:22.424 ], 00:16:22.424 "product_name": "Malloc disk", 00:16:22.424 "block_size": 512, 00:16:22.424 "num_blocks": 65536, 00:16:22.424 "uuid": "8850f7b9-f66b-4cde-b657-d34b5d9a8d0e", 00:16:22.424 "assigned_rate_limits": { 00:16:22.424 "rw_ios_per_sec": 0, 00:16:22.424 "rw_mbytes_per_sec": 0, 
00:16:22.424 "r_mbytes_per_sec": 0, 00:16:22.425 "w_mbytes_per_sec": 0 00:16:22.425 }, 00:16:22.425 "claimed": true, 00:16:22.425 "claim_type": "exclusive_write", 00:16:22.425 "zoned": false, 00:16:22.425 "supported_io_types": { 00:16:22.425 "read": true, 00:16:22.425 "write": true, 00:16:22.425 "unmap": true, 00:16:22.425 "flush": true, 00:16:22.425 "reset": true, 00:16:22.425 "nvme_admin": false, 00:16:22.425 "nvme_io": false, 00:16:22.425 "nvme_io_md": false, 00:16:22.425 "write_zeroes": true, 00:16:22.425 "zcopy": true, 00:16:22.425 "get_zone_info": false, 00:16:22.425 "zone_management": false, 00:16:22.425 "zone_append": false, 00:16:22.425 "compare": false, 00:16:22.425 "compare_and_write": false, 00:16:22.425 "abort": true, 00:16:22.425 "seek_hole": false, 00:16:22.425 "seek_data": false, 00:16:22.425 "copy": true, 00:16:22.425 "nvme_iov_md": false 00:16:22.425 }, 00:16:22.425 "memory_domains": [ 00:16:22.425 { 00:16:22.425 "dma_device_id": "system", 00:16:22.425 "dma_device_type": 1 00:16:22.425 }, 00:16:22.425 { 00:16:22.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.425 "dma_device_type": 2 00:16:22.425 } 00:16:22.425 ], 00:16:22.425 "driver_specific": {} 00:16:22.425 }' 00:16:22.425 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.425 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.425 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:22.425 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.684 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.684 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:22.684 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.684 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.684 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:22.684 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.684 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.684 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:22.684 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:22.684 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:22.684 22:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:22.943 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:22.943 "name": "BaseBdev2", 00:16:22.943 "aliases": [ 00:16:22.943 "26697972-fa17-4533-8cd2-ce1a427f0863" 00:16:22.943 ], 00:16:22.943 "product_name": "Malloc disk", 00:16:22.943 "block_size": 512, 00:16:22.943 "num_blocks": 65536, 00:16:22.943 "uuid": "26697972-fa17-4533-8cd2-ce1a427f0863", 00:16:22.943 "assigned_rate_limits": { 00:16:22.943 "rw_ios_per_sec": 0, 00:16:22.943 "rw_mbytes_per_sec": 0, 00:16:22.943 "r_mbytes_per_sec": 0, 00:16:22.943 "w_mbytes_per_sec": 0 00:16:22.943 }, 00:16:22.943 "claimed": true, 00:16:22.943 
"claim_type": "exclusive_write", 00:16:22.943 "zoned": false, 00:16:22.943 "supported_io_types": { 00:16:22.943 "read": true, 00:16:22.943 "write": true, 00:16:22.943 "unmap": true, 00:16:22.943 "flush": true, 00:16:22.943 "reset": true, 00:16:22.943 "nvme_admin": false, 00:16:22.943 "nvme_io": false, 00:16:22.943 "nvme_io_md": false, 00:16:22.943 "write_zeroes": true, 00:16:22.943 "zcopy": true, 00:16:22.943 "get_zone_info": false, 00:16:22.943 "zone_management": false, 00:16:22.943 "zone_append": false, 00:16:22.943 "compare": false, 00:16:22.943 "compare_and_write": false, 00:16:22.943 "abort": true, 00:16:22.943 "seek_hole": false, 00:16:22.943 "seek_data": false, 00:16:22.943 "copy": true, 00:16:22.943 "nvme_iov_md": false 00:16:22.943 }, 00:16:22.943 "memory_domains": [ 00:16:22.943 { 00:16:22.943 "dma_device_id": "system", 00:16:22.943 "dma_device_type": 1 00:16:22.943 }, 00:16:22.943 { 00:16:22.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.943 "dma_device_type": 2 00:16:22.943 } 00:16:22.943 ], 00:16:22.943 "driver_specific": {} 00:16:22.943 }' 00:16:22.943 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:23.201 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:23.459 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:23.459 "name": "BaseBdev3", 00:16:23.459 "aliases": [ 00:16:23.459 "c542222a-b616-4b3e-935d-70069d13c8bd" 00:16:23.459 ], 00:16:23.459 "product_name": "Malloc disk", 00:16:23.459 "block_size": 512, 00:16:23.459 "num_blocks": 65536, 00:16:23.459 "uuid": "c542222a-b616-4b3e-935d-70069d13c8bd", 00:16:23.459 "assigned_rate_limits": { 00:16:23.459 "rw_ios_per_sec": 0, 00:16:23.459 "rw_mbytes_per_sec": 0, 00:16:23.459 "r_mbytes_per_sec": 0, 00:16:23.459 "w_mbytes_per_sec": 0 00:16:23.459 }, 00:16:23.459 "claimed": true, 00:16:23.459 "claim_type": "exclusive_write", 00:16:23.459 "zoned": false, 00:16:23.459 "supported_io_types": { 00:16:23.459 "read": true, 
00:16:23.459 "write": true, 00:16:23.459 "unmap": true, 00:16:23.459 "flush": true, 00:16:23.459 "reset": true, 00:16:23.459 "nvme_admin": false, 00:16:23.459 "nvme_io": false, 00:16:23.459 "nvme_io_md": false, 00:16:23.459 "write_zeroes": true, 00:16:23.459 "zcopy": true, 00:16:23.459 "get_zone_info": false, 00:16:23.459 "zone_management": false, 00:16:23.459 "zone_append": false, 00:16:23.459 "compare": false, 00:16:23.459 "compare_and_write": false, 00:16:23.459 "abort": true, 00:16:23.459 "seek_hole": false, 00:16:23.459 "seek_data": false, 00:16:23.459 "copy": true, 00:16:23.459 "nvme_iov_md": false 00:16:23.459 }, 00:16:23.459 "memory_domains": [ 00:16:23.459 { 00:16:23.459 "dma_device_id": "system", 00:16:23.459 "dma_device_type": 1 00:16:23.459 }, 00:16:23.459 { 00:16:23.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.459 "dma_device_type": 2 00:16:23.459 } 00:16:23.459 ], 00:16:23.459 "driver_specific": {} 00:16:23.459 }' 00:16:23.459 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.717 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.717 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:23.717 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.717 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.717 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:23.717 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.717 22:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.717 22:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:23.717 22:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.976 22:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.976 22:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:23.976 22:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:24.235 [2024-07-12 22:23:34.320755] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:24.235 [2024-07-12 22:23:34.320784] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:24.235 [2024-07-12 22:23:34.320837] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:24.235 [2024-07-12 22:23:34.320889] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:24.235 [2024-07-12 22:23:34.320908] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2471f50 name Existed_Raid, state offline 00:16:24.235 22:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3458967 00:16:24.235 22:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3458967 ']' 00:16:24.235 22:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3458967 00:16:24.235 22:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 
00:16:24.235 22:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:24.235 22:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3458967 00:16:24.235 22:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:24.235 22:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:24.235 22:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3458967' 00:16:24.235 killing process with pid 3458967 00:16:24.235 22:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3458967 00:16:24.235 [2024-07-12 22:23:34.388135] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:24.235 22:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3458967 00:16:24.235 [2024-07-12 22:23:34.415005] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:24.494 22:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:24.494 00:16:24.494 real 0m28.110s 00:16:24.494 user 0m51.587s 00:16:24.494 sys 0m5.077s 00:16:24.494 22:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:24.494 22:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:24.494 ************************************ 00:16:24.494 END TEST raid_state_function_test_sb 00:16:24.494 ************************************ 00:16:24.494 22:23:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:24.494 22:23:34 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:16:24.494 22:23:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:24.494 22:23:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:24.494 22:23:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:24.494 ************************************ 00:16:24.494 START TEST raid_superblock_test 00:16:24.494 ************************************ 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local 
strip_size_create_arg 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:24.494 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3463179 00:16:24.495 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3463179 /var/tmp/spdk-raid.sock 00:16:24.495 22:23:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:24.495 22:23:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3463179 ']' 00:16:24.495 22:23:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:24.495 22:23:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:24.495 22:23:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:24.495 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:24.495 22:23:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:24.495 22:23:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.495 [2024-07-12 22:23:34.784903] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
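The raid_superblock_test that begins here starts a fresh bdev_svc instance and then assembles a concat array on top of passthru bdevs. As a rough companion to the trace that follows, this is the create path in condensed form; the commands, sizes, UUIDs and flags are the ones that appear further down in the log, and only the loop and comments are editorial shorthand.

#!/usr/bin/env bash
# Condensed sketch of the raid_superblock_test setup traced below.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Three 32 MiB / 512-byte-block malloc bdevs, each wrapped in a passthru bdev
# with a fixed UUID; the pt* bdevs become the RAID members.
for i in 1 2 3; do
    $RPC bdev_malloc_create 32 512 -b malloc$i
    $RPC bdev_passthru_create -b malloc$i -p pt$i \
         -u 00000000-0000-0000-0000-00000000000$i
done

# Concat array with a 64 KiB strip size; -s enables the on-disk superblock,
# which shows up as "superblock": true in the dump further down.
$RPC bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s

# The new array reports state "online" with all three members configured.
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'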
00:16:24.495 [2024-07-12 22:23:34.784977] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3463179 ] 00:16:24.754 [2024-07-12 22:23:34.914586] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:24.754 [2024-07-12 22:23:35.020875] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:25.012 [2024-07-12 22:23:35.085058] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:25.012 [2024-07-12 22:23:35.085094] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:25.580 22:23:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:25.580 22:23:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:25.580 22:23:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:25.580 22:23:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:25.580 22:23:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:25.580 22:23:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:25.580 22:23:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:25.580 22:23:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:25.580 22:23:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:25.580 22:23:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:25.580 22:23:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:25.839 malloc1 00:16:25.839 22:23:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:26.201 [2024-07-12 22:23:36.214575] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:26.201 [2024-07-12 22:23:36.214627] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:26.201 [2024-07-12 22:23:36.214649] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbb1570 00:16:26.201 [2024-07-12 22:23:36.214663] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:26.201 [2024-07-12 22:23:36.216402] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:26.201 [2024-07-12 22:23:36.216433] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:26.201 pt1 00:16:26.202 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:26.202 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:26.202 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:26.202 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:26.202 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:26.202 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:26.202 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:26.202 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:26.202 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:26.464 malloc2 00:16:26.465 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:26.465 [2024-07-12 22:23:36.701913] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:26.465 [2024-07-12 22:23:36.701971] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:26.465 [2024-07-12 22:23:36.701992] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbb2970 00:16:26.465 [2024-07-12 22:23:36.702006] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:26.465 [2024-07-12 22:23:36.703657] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:26.465 [2024-07-12 22:23:36.703686] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:26.465 pt2 00:16:26.465 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:26.465 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:26.465 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:26.465 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:26.465 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:26.465 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:26.465 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:26.465 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:26.465 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:26.723 malloc3 00:16:26.723 22:23:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:26.983 [2024-07-12 22:23:37.205009] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:26.983 [2024-07-12 22:23:37.205056] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:26.983 [2024-07-12 22:23:37.205077] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd49340 00:16:26.983 [2024-07-12 22:23:37.205090] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:26.983 [2024-07-12 22:23:37.206653] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:26.983 [2024-07-12 22:23:37.206681] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:26.983 pt3 00:16:26.983 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:26.983 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:26.983 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:27.242 [2024-07-12 22:23:37.437646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:27.242 [2024-07-12 22:23:37.438953] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:27.242 [2024-07-12 22:23:37.439009] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:27.242 [2024-07-12 22:23:37.439162] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xba9ea0 00:16:27.242 [2024-07-12 22:23:37.439173] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:27.242 [2024-07-12 22:23:37.439370] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbb1240 00:16:27.242 [2024-07-12 22:23:37.439513] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xba9ea0 00:16:27.242 [2024-07-12 22:23:37.439523] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xba9ea0 00:16:27.242 [2024-07-12 22:23:37.439625] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:27.242 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:27.242 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:27.242 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:27.242 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:27.242 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:27.242 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:27.242 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.242 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.242 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.242 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.242 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.242 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:27.503 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:27.503 "name": "raid_bdev1", 00:16:27.503 "uuid": "8e105a71-9fa9-4922-9fdb-d8380c456132", 00:16:27.503 "strip_size_kb": 64, 00:16:27.503 "state": "online", 00:16:27.503 "raid_level": "concat", 00:16:27.503 "superblock": true, 00:16:27.503 "num_base_bdevs": 3, 
00:16:27.503 "num_base_bdevs_discovered": 3, 00:16:27.503 "num_base_bdevs_operational": 3, 00:16:27.503 "base_bdevs_list": [ 00:16:27.503 { 00:16:27.503 "name": "pt1", 00:16:27.503 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:27.503 "is_configured": true, 00:16:27.503 "data_offset": 2048, 00:16:27.503 "data_size": 63488 00:16:27.503 }, 00:16:27.503 { 00:16:27.503 "name": "pt2", 00:16:27.503 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:27.503 "is_configured": true, 00:16:27.503 "data_offset": 2048, 00:16:27.503 "data_size": 63488 00:16:27.503 }, 00:16:27.503 { 00:16:27.503 "name": "pt3", 00:16:27.503 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:27.503 "is_configured": true, 00:16:27.503 "data_offset": 2048, 00:16:27.503 "data_size": 63488 00:16:27.503 } 00:16:27.503 ] 00:16:27.503 }' 00:16:27.503 22:23:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:27.503 22:23:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.071 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:28.071 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:28.071 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:28.071 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:28.071 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:28.071 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:28.071 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:28.071 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:28.330 [2024-07-12 22:23:38.516738] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:28.330 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:28.330 "name": "raid_bdev1", 00:16:28.330 "aliases": [ 00:16:28.330 "8e105a71-9fa9-4922-9fdb-d8380c456132" 00:16:28.330 ], 00:16:28.330 "product_name": "Raid Volume", 00:16:28.330 "block_size": 512, 00:16:28.330 "num_blocks": 190464, 00:16:28.330 "uuid": "8e105a71-9fa9-4922-9fdb-d8380c456132", 00:16:28.330 "assigned_rate_limits": { 00:16:28.330 "rw_ios_per_sec": 0, 00:16:28.330 "rw_mbytes_per_sec": 0, 00:16:28.330 "r_mbytes_per_sec": 0, 00:16:28.330 "w_mbytes_per_sec": 0 00:16:28.330 }, 00:16:28.330 "claimed": false, 00:16:28.330 "zoned": false, 00:16:28.330 "supported_io_types": { 00:16:28.330 "read": true, 00:16:28.330 "write": true, 00:16:28.330 "unmap": true, 00:16:28.330 "flush": true, 00:16:28.330 "reset": true, 00:16:28.330 "nvme_admin": false, 00:16:28.330 "nvme_io": false, 00:16:28.330 "nvme_io_md": false, 00:16:28.330 "write_zeroes": true, 00:16:28.330 "zcopy": false, 00:16:28.330 "get_zone_info": false, 00:16:28.330 "zone_management": false, 00:16:28.330 "zone_append": false, 00:16:28.330 "compare": false, 00:16:28.330 "compare_and_write": false, 00:16:28.330 "abort": false, 00:16:28.330 "seek_hole": false, 00:16:28.330 "seek_data": false, 00:16:28.330 "copy": false, 00:16:28.330 "nvme_iov_md": false 00:16:28.330 }, 00:16:28.330 "memory_domains": [ 00:16:28.330 { 00:16:28.330 "dma_device_id": "system", 00:16:28.330 "dma_device_type": 1 
00:16:28.330 }, 00:16:28.330 { 00:16:28.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.330 "dma_device_type": 2 00:16:28.330 }, 00:16:28.330 { 00:16:28.330 "dma_device_id": "system", 00:16:28.330 "dma_device_type": 1 00:16:28.330 }, 00:16:28.330 { 00:16:28.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.331 "dma_device_type": 2 00:16:28.331 }, 00:16:28.331 { 00:16:28.331 "dma_device_id": "system", 00:16:28.331 "dma_device_type": 1 00:16:28.331 }, 00:16:28.331 { 00:16:28.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.331 "dma_device_type": 2 00:16:28.331 } 00:16:28.331 ], 00:16:28.331 "driver_specific": { 00:16:28.331 "raid": { 00:16:28.331 "uuid": "8e105a71-9fa9-4922-9fdb-d8380c456132", 00:16:28.331 "strip_size_kb": 64, 00:16:28.331 "state": "online", 00:16:28.331 "raid_level": "concat", 00:16:28.331 "superblock": true, 00:16:28.331 "num_base_bdevs": 3, 00:16:28.331 "num_base_bdevs_discovered": 3, 00:16:28.331 "num_base_bdevs_operational": 3, 00:16:28.331 "base_bdevs_list": [ 00:16:28.331 { 00:16:28.331 "name": "pt1", 00:16:28.331 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:28.331 "is_configured": true, 00:16:28.331 "data_offset": 2048, 00:16:28.331 "data_size": 63488 00:16:28.331 }, 00:16:28.331 { 00:16:28.331 "name": "pt2", 00:16:28.331 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:28.331 "is_configured": true, 00:16:28.331 "data_offset": 2048, 00:16:28.331 "data_size": 63488 00:16:28.331 }, 00:16:28.331 { 00:16:28.331 "name": "pt3", 00:16:28.331 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:28.331 "is_configured": true, 00:16:28.331 "data_offset": 2048, 00:16:28.331 "data_size": 63488 00:16:28.331 } 00:16:28.331 ] 00:16:28.331 } 00:16:28.331 } 00:16:28.331 }' 00:16:28.331 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:28.331 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:28.331 pt2 00:16:28.331 pt3' 00:16:28.331 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:28.331 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:28.331 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:28.590 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:28.590 "name": "pt1", 00:16:28.590 "aliases": [ 00:16:28.590 "00000000-0000-0000-0000-000000000001" 00:16:28.590 ], 00:16:28.590 "product_name": "passthru", 00:16:28.590 "block_size": 512, 00:16:28.590 "num_blocks": 65536, 00:16:28.590 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:28.590 "assigned_rate_limits": { 00:16:28.590 "rw_ios_per_sec": 0, 00:16:28.590 "rw_mbytes_per_sec": 0, 00:16:28.590 "r_mbytes_per_sec": 0, 00:16:28.590 "w_mbytes_per_sec": 0 00:16:28.590 }, 00:16:28.590 "claimed": true, 00:16:28.590 "claim_type": "exclusive_write", 00:16:28.590 "zoned": false, 00:16:28.590 "supported_io_types": { 00:16:28.590 "read": true, 00:16:28.591 "write": true, 00:16:28.591 "unmap": true, 00:16:28.591 "flush": true, 00:16:28.591 "reset": true, 00:16:28.591 "nvme_admin": false, 00:16:28.591 "nvme_io": false, 00:16:28.591 "nvme_io_md": false, 00:16:28.591 "write_zeroes": true, 00:16:28.591 "zcopy": true, 00:16:28.591 "get_zone_info": false, 00:16:28.591 "zone_management": false, 
00:16:28.591 "zone_append": false, 00:16:28.591 "compare": false, 00:16:28.591 "compare_and_write": false, 00:16:28.591 "abort": true, 00:16:28.591 "seek_hole": false, 00:16:28.591 "seek_data": false, 00:16:28.591 "copy": true, 00:16:28.591 "nvme_iov_md": false 00:16:28.591 }, 00:16:28.591 "memory_domains": [ 00:16:28.591 { 00:16:28.591 "dma_device_id": "system", 00:16:28.591 "dma_device_type": 1 00:16:28.591 }, 00:16:28.591 { 00:16:28.591 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.591 "dma_device_type": 2 00:16:28.591 } 00:16:28.591 ], 00:16:28.591 "driver_specific": { 00:16:28.591 "passthru": { 00:16:28.591 "name": "pt1", 00:16:28.591 "base_bdev_name": "malloc1" 00:16:28.591 } 00:16:28.591 } 00:16:28.591 }' 00:16:28.591 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:28.591 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:28.591 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:28.591 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:28.591 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:28.850 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:28.850 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:28.850 22:23:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:28.850 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:28.850 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:28.850 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:28.850 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:28.850 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:28.850 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:28.850 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:29.109 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:29.109 "name": "pt2", 00:16:29.109 "aliases": [ 00:16:29.109 "00000000-0000-0000-0000-000000000002" 00:16:29.109 ], 00:16:29.109 "product_name": "passthru", 00:16:29.109 "block_size": 512, 00:16:29.109 "num_blocks": 65536, 00:16:29.109 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:29.109 "assigned_rate_limits": { 00:16:29.109 "rw_ios_per_sec": 0, 00:16:29.109 "rw_mbytes_per_sec": 0, 00:16:29.109 "r_mbytes_per_sec": 0, 00:16:29.109 "w_mbytes_per_sec": 0 00:16:29.109 }, 00:16:29.109 "claimed": true, 00:16:29.109 "claim_type": "exclusive_write", 00:16:29.109 "zoned": false, 00:16:29.109 "supported_io_types": { 00:16:29.109 "read": true, 00:16:29.109 "write": true, 00:16:29.109 "unmap": true, 00:16:29.109 "flush": true, 00:16:29.109 "reset": true, 00:16:29.109 "nvme_admin": false, 00:16:29.109 "nvme_io": false, 00:16:29.110 "nvme_io_md": false, 00:16:29.110 "write_zeroes": true, 00:16:29.110 "zcopy": true, 00:16:29.110 "get_zone_info": false, 00:16:29.110 "zone_management": false, 00:16:29.110 "zone_append": false, 00:16:29.110 "compare": false, 00:16:29.110 "compare_and_write": false, 00:16:29.110 "abort": true, 
00:16:29.110 "seek_hole": false, 00:16:29.110 "seek_data": false, 00:16:29.110 "copy": true, 00:16:29.110 "nvme_iov_md": false 00:16:29.110 }, 00:16:29.110 "memory_domains": [ 00:16:29.110 { 00:16:29.110 "dma_device_id": "system", 00:16:29.110 "dma_device_type": 1 00:16:29.110 }, 00:16:29.110 { 00:16:29.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.110 "dma_device_type": 2 00:16:29.110 } 00:16:29.110 ], 00:16:29.110 "driver_specific": { 00:16:29.110 "passthru": { 00:16:29.110 "name": "pt2", 00:16:29.110 "base_bdev_name": "malloc2" 00:16:29.110 } 00:16:29.110 } 00:16:29.110 }' 00:16:29.110 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.110 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.369 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:29.369 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.369 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.369 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:29.369 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.369 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.369 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:29.369 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:29.369 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:29.629 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:29.629 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:29.629 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:29.629 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:29.889 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:29.889 "name": "pt3", 00:16:29.889 "aliases": [ 00:16:29.889 "00000000-0000-0000-0000-000000000003" 00:16:29.889 ], 00:16:29.889 "product_name": "passthru", 00:16:29.889 "block_size": 512, 00:16:29.889 "num_blocks": 65536, 00:16:29.889 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:29.889 "assigned_rate_limits": { 00:16:29.889 "rw_ios_per_sec": 0, 00:16:29.889 "rw_mbytes_per_sec": 0, 00:16:29.889 "r_mbytes_per_sec": 0, 00:16:29.889 "w_mbytes_per_sec": 0 00:16:29.889 }, 00:16:29.889 "claimed": true, 00:16:29.889 "claim_type": "exclusive_write", 00:16:29.889 "zoned": false, 00:16:29.889 "supported_io_types": { 00:16:29.889 "read": true, 00:16:29.889 "write": true, 00:16:29.889 "unmap": true, 00:16:29.889 "flush": true, 00:16:29.889 "reset": true, 00:16:29.889 "nvme_admin": false, 00:16:29.889 "nvme_io": false, 00:16:29.889 "nvme_io_md": false, 00:16:29.889 "write_zeroes": true, 00:16:29.889 "zcopy": true, 00:16:29.889 "get_zone_info": false, 00:16:29.889 "zone_management": false, 00:16:29.889 "zone_append": false, 00:16:29.889 "compare": false, 00:16:29.889 "compare_and_write": false, 00:16:29.889 "abort": true, 00:16:29.889 "seek_hole": false, 00:16:29.889 "seek_data": false, 00:16:29.889 "copy": true, 00:16:29.889 "nvme_iov_md": false 
00:16:29.889 }, 00:16:29.889 "memory_domains": [ 00:16:29.889 { 00:16:29.889 "dma_device_id": "system", 00:16:29.889 "dma_device_type": 1 00:16:29.889 }, 00:16:29.889 { 00:16:29.889 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.889 "dma_device_type": 2 00:16:29.889 } 00:16:29.889 ], 00:16:29.889 "driver_specific": { 00:16:29.889 "passthru": { 00:16:29.889 "name": "pt3", 00:16:29.889 "base_bdev_name": "malloc3" 00:16:29.889 } 00:16:29.889 } 00:16:29.889 }' 00:16:29.889 22:23:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.889 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.889 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:29.889 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.889 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.889 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:29.889 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.889 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:30.148 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:30.148 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.148 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.148 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:30.148 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:30.148 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:30.407 [2024-07-12 22:23:40.566243] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:30.407 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8e105a71-9fa9-4922-9fdb-d8380c456132 00:16:30.407 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 8e105a71-9fa9-4922-9fdb-d8380c456132 ']' 00:16:30.407 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:30.665 [2024-07-12 22:23:40.810604] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:30.665 [2024-07-12 22:23:40.810628] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:30.665 [2024-07-12 22:23:40.810679] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:30.665 [2024-07-12 22:23:40.810732] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:30.665 [2024-07-12 22:23:40.810749] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xba9ea0 name raid_bdev1, state offline 00:16:30.665 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:30.665 22:23:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.924 22:23:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:30.924 22:23:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:30.924 22:23:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:30.924 22:23:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:31.182 22:23:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:31.182 22:23:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:31.440 22:23:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:31.440 22:23:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:31.698 22:23:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:31.698 22:23:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:31.957 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 
'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:31.957 [2024-07-12 22:23:42.274484] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:31.957 [2024-07-12 22:23:42.275828] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:31.957 [2024-07-12 22:23:42.275870] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:31.957 [2024-07-12 22:23:42.275923] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:31.957 [2024-07-12 22:23:42.275973] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:31.957 [2024-07-12 22:23:42.275996] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:31.957 [2024-07-12 22:23:42.276014] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:31.957 [2024-07-12 22:23:42.276026] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd54ff0 name raid_bdev1, state configuring 00:16:31.957 request: 00:16:31.957 { 00:16:31.957 "name": "raid_bdev1", 00:16:31.957 "raid_level": "concat", 00:16:31.957 "base_bdevs": [ 00:16:31.957 "malloc1", 00:16:31.957 "malloc2", 00:16:31.957 "malloc3" 00:16:31.957 ], 00:16:31.957 "strip_size_kb": 64, 00:16:31.957 "superblock": false, 00:16:31.957 "method": "bdev_raid_create", 00:16:31.957 "req_id": 1 00:16:31.957 } 00:16:31.957 Got JSON-RPC error response 00:16:31.957 response: 00:16:31.957 { 00:16:31.957 "code": -17, 00:16:31.957 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:31.957 } 00:16:32.260 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:32.260 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:16:32.260 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:32.260 22:23:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:32.260 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.260 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:32.260 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:32.260 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:32.260 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:32.519 [2024-07-12 22:23:42.767722] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:32.519 [2024-07-12 22:23:42.767766] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:32.519 [2024-07-12 22:23:42.767788] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbb17a0 00:16:32.519 [2024-07-12 22:23:42.767800] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:32.519 [2024-07-12 22:23:42.769404] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:32.519 [2024-07-12 22:23:42.769432] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:32.519 [2024-07-12 22:23:42.769497] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:32.519 [2024-07-12 22:23:42.769526] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:32.519 pt1 00:16:32.519 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:32.519 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:32.519 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:32.519 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:32.519 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:32.519 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:32.519 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:32.519 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:32.519 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:32.520 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:32.520 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.520 22:23:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:32.778 22:23:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.778 "name": "raid_bdev1", 00:16:32.778 "uuid": "8e105a71-9fa9-4922-9fdb-d8380c456132", 00:16:32.778 "strip_size_kb": 64, 00:16:32.778 "state": "configuring", 00:16:32.778 "raid_level": "concat", 00:16:32.778 "superblock": true, 00:16:32.778 "num_base_bdevs": 3, 00:16:32.778 "num_base_bdevs_discovered": 1, 00:16:32.778 "num_base_bdevs_operational": 3, 00:16:32.778 "base_bdevs_list": [ 00:16:32.778 { 00:16:32.778 "name": "pt1", 00:16:32.778 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:32.778 "is_configured": true, 00:16:32.778 "data_offset": 2048, 00:16:32.778 "data_size": 63488 00:16:32.778 }, 00:16:32.778 { 00:16:32.778 "name": null, 00:16:32.778 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:32.778 "is_configured": false, 00:16:32.778 "data_offset": 2048, 00:16:32.778 "data_size": 63488 00:16:32.778 }, 00:16:32.778 { 00:16:32.779 "name": null, 00:16:32.779 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:32.779 "is_configured": false, 00:16:32.779 "data_offset": 2048, 00:16:32.779 "data_size": 63488 00:16:32.779 } 00:16:32.779 ] 00:16:32.779 }' 00:16:32.779 22:23:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.779 22:23:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:33.346 22:23:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:16:33.346 22:23:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:33.605 [2024-07-12 22:23:43.870652] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:33.605 [2024-07-12 22:23:43.870699] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:33.605 [2024-07-12 22:23:43.870719] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xba8c70 00:16:33.605 [2024-07-12 22:23:43.870731] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:33.605 [2024-07-12 22:23:43.871078] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:33.605 [2024-07-12 22:23:43.871096] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:33.605 [2024-07-12 22:23:43.871157] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:33.605 [2024-07-12 22:23:43.871176] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:33.605 pt2 00:16:33.605 22:23:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:33.864 [2024-07-12 22:23:44.115306] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:33.864 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:33.864 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:33.864 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:33.864 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:33.864 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:33.864 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:33.864 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.864 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.864 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.864 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.864 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.864 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:34.123 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.123 "name": "raid_bdev1", 00:16:34.123 "uuid": "8e105a71-9fa9-4922-9fdb-d8380c456132", 00:16:34.123 "strip_size_kb": 64, 00:16:34.123 "state": "configuring", 00:16:34.123 "raid_level": "concat", 00:16:34.123 "superblock": true, 00:16:34.123 "num_base_bdevs": 3, 00:16:34.123 "num_base_bdevs_discovered": 1, 00:16:34.123 "num_base_bdevs_operational": 3, 00:16:34.123 "base_bdevs_list": [ 00:16:34.123 { 00:16:34.123 "name": "pt1", 00:16:34.123 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:34.123 "is_configured": true, 00:16:34.123 "data_offset": 2048, 00:16:34.123 "data_size": 63488 00:16:34.123 }, 00:16:34.123 { 00:16:34.123 "name": null, 00:16:34.123 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:34.123 "is_configured": false, 00:16:34.123 
"data_offset": 2048, 00:16:34.123 "data_size": 63488 00:16:34.123 }, 00:16:34.123 { 00:16:34.123 "name": null, 00:16:34.123 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:34.123 "is_configured": false, 00:16:34.123 "data_offset": 2048, 00:16:34.123 "data_size": 63488 00:16:34.123 } 00:16:34.123 ] 00:16:34.123 }' 00:16:34.123 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.123 22:23:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:34.692 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:16:34.692 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:34.692 22:23:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:34.951 [2024-07-12 22:23:45.182119] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:34.951 [2024-07-12 22:23:45.182167] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:34.951 [2024-07-12 22:23:45.182190] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbb1a10 00:16:34.951 [2024-07-12 22:23:45.182203] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:34.951 [2024-07-12 22:23:45.182528] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:34.951 [2024-07-12 22:23:45.182544] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:34.951 [2024-07-12 22:23:45.182604] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:34.951 [2024-07-12 22:23:45.182622] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:34.951 pt2 00:16:34.951 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:34.951 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:34.951 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:35.211 [2024-07-12 22:23:45.426762] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:35.211 [2024-07-12 22:23:45.426793] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:35.211 [2024-07-12 22:23:45.426809] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd4b740 00:16:35.211 [2024-07-12 22:23:45.426821] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:35.211 [2024-07-12 22:23:45.427095] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:35.211 [2024-07-12 22:23:45.427113] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:35.211 [2024-07-12 22:23:45.427163] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:35.211 [2024-07-12 22:23:45.427180] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:35.211 [2024-07-12 22:23:45.427284] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd4bc00 00:16:35.211 [2024-07-12 22:23:45.427294] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:35.211 [2024-07-12 22:23:45.427457] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbb0a40 00:16:35.211 [2024-07-12 22:23:45.427577] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd4bc00 00:16:35.211 [2024-07-12 22:23:45.427587] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd4bc00 00:16:35.211 [2024-07-12 22:23:45.427692] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:35.211 pt3 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:35.211 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.470 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.470 "name": "raid_bdev1", 00:16:35.470 "uuid": "8e105a71-9fa9-4922-9fdb-d8380c456132", 00:16:35.470 "strip_size_kb": 64, 00:16:35.470 "state": "online", 00:16:35.470 "raid_level": "concat", 00:16:35.470 "superblock": true, 00:16:35.470 "num_base_bdevs": 3, 00:16:35.470 "num_base_bdevs_discovered": 3, 00:16:35.470 "num_base_bdevs_operational": 3, 00:16:35.470 "base_bdevs_list": [ 00:16:35.470 { 00:16:35.470 "name": "pt1", 00:16:35.470 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:35.470 "is_configured": true, 00:16:35.470 "data_offset": 2048, 00:16:35.470 "data_size": 63488 00:16:35.470 }, 00:16:35.470 { 00:16:35.470 "name": "pt2", 00:16:35.470 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:35.470 "is_configured": true, 00:16:35.470 "data_offset": 2048, 00:16:35.470 "data_size": 63488 00:16:35.470 }, 00:16:35.470 { 00:16:35.470 "name": "pt3", 00:16:35.470 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:35.470 "is_configured": true, 00:16:35.470 "data_offset": 2048, 00:16:35.470 "data_size": 63488 00:16:35.470 } 00:16:35.470 ] 00:16:35.470 }' 00:16:35.470 22:23:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.470 22:23:45 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:36.039 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:16:36.039 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:36.039 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:36.039 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:36.039 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:36.039 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:36.039 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:36.039 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:36.298 [2024-07-12 22:23:46.517946] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:36.298 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:36.298 "name": "raid_bdev1", 00:16:36.298 "aliases": [ 00:16:36.298 "8e105a71-9fa9-4922-9fdb-d8380c456132" 00:16:36.298 ], 00:16:36.298 "product_name": "Raid Volume", 00:16:36.298 "block_size": 512, 00:16:36.298 "num_blocks": 190464, 00:16:36.298 "uuid": "8e105a71-9fa9-4922-9fdb-d8380c456132", 00:16:36.298 "assigned_rate_limits": { 00:16:36.298 "rw_ios_per_sec": 0, 00:16:36.298 "rw_mbytes_per_sec": 0, 00:16:36.298 "r_mbytes_per_sec": 0, 00:16:36.298 "w_mbytes_per_sec": 0 00:16:36.298 }, 00:16:36.298 "claimed": false, 00:16:36.298 "zoned": false, 00:16:36.298 "supported_io_types": { 00:16:36.298 "read": true, 00:16:36.298 "write": true, 00:16:36.298 "unmap": true, 00:16:36.298 "flush": true, 00:16:36.298 "reset": true, 00:16:36.298 "nvme_admin": false, 00:16:36.298 "nvme_io": false, 00:16:36.298 "nvme_io_md": false, 00:16:36.298 "write_zeroes": true, 00:16:36.298 "zcopy": false, 00:16:36.298 "get_zone_info": false, 00:16:36.298 "zone_management": false, 00:16:36.298 "zone_append": false, 00:16:36.298 "compare": false, 00:16:36.298 "compare_and_write": false, 00:16:36.298 "abort": false, 00:16:36.298 "seek_hole": false, 00:16:36.298 "seek_data": false, 00:16:36.298 "copy": false, 00:16:36.298 "nvme_iov_md": false 00:16:36.298 }, 00:16:36.299 "memory_domains": [ 00:16:36.299 { 00:16:36.299 "dma_device_id": "system", 00:16:36.299 "dma_device_type": 1 00:16:36.299 }, 00:16:36.299 { 00:16:36.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.299 "dma_device_type": 2 00:16:36.299 }, 00:16:36.299 { 00:16:36.299 "dma_device_id": "system", 00:16:36.299 "dma_device_type": 1 00:16:36.299 }, 00:16:36.299 { 00:16:36.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.299 "dma_device_type": 2 00:16:36.299 }, 00:16:36.299 { 00:16:36.299 "dma_device_id": "system", 00:16:36.299 "dma_device_type": 1 00:16:36.299 }, 00:16:36.299 { 00:16:36.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.299 "dma_device_type": 2 00:16:36.299 } 00:16:36.299 ], 00:16:36.299 "driver_specific": { 00:16:36.299 "raid": { 00:16:36.299 "uuid": "8e105a71-9fa9-4922-9fdb-d8380c456132", 00:16:36.299 "strip_size_kb": 64, 00:16:36.299 "state": "online", 00:16:36.299 "raid_level": "concat", 00:16:36.299 "superblock": true, 00:16:36.299 "num_base_bdevs": 3, 00:16:36.299 "num_base_bdevs_discovered": 3, 
00:16:36.299 "num_base_bdevs_operational": 3, 00:16:36.299 "base_bdevs_list": [ 00:16:36.299 { 00:16:36.299 "name": "pt1", 00:16:36.299 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:36.299 "is_configured": true, 00:16:36.299 "data_offset": 2048, 00:16:36.299 "data_size": 63488 00:16:36.299 }, 00:16:36.299 { 00:16:36.299 "name": "pt2", 00:16:36.299 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:36.299 "is_configured": true, 00:16:36.299 "data_offset": 2048, 00:16:36.299 "data_size": 63488 00:16:36.299 }, 00:16:36.299 { 00:16:36.299 "name": "pt3", 00:16:36.299 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:36.299 "is_configured": true, 00:16:36.299 "data_offset": 2048, 00:16:36.299 "data_size": 63488 00:16:36.299 } 00:16:36.299 ] 00:16:36.299 } 00:16:36.299 } 00:16:36.299 }' 00:16:36.299 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:36.299 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:36.299 pt2 00:16:36.299 pt3' 00:16:36.299 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:36.299 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:36.299 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:36.558 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:36.558 "name": "pt1", 00:16:36.558 "aliases": [ 00:16:36.558 "00000000-0000-0000-0000-000000000001" 00:16:36.558 ], 00:16:36.558 "product_name": "passthru", 00:16:36.558 "block_size": 512, 00:16:36.558 "num_blocks": 65536, 00:16:36.558 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:36.558 "assigned_rate_limits": { 00:16:36.558 "rw_ios_per_sec": 0, 00:16:36.558 "rw_mbytes_per_sec": 0, 00:16:36.558 "r_mbytes_per_sec": 0, 00:16:36.558 "w_mbytes_per_sec": 0 00:16:36.558 }, 00:16:36.558 "claimed": true, 00:16:36.558 "claim_type": "exclusive_write", 00:16:36.558 "zoned": false, 00:16:36.558 "supported_io_types": { 00:16:36.558 "read": true, 00:16:36.558 "write": true, 00:16:36.558 "unmap": true, 00:16:36.558 "flush": true, 00:16:36.558 "reset": true, 00:16:36.558 "nvme_admin": false, 00:16:36.558 "nvme_io": false, 00:16:36.558 "nvme_io_md": false, 00:16:36.558 "write_zeroes": true, 00:16:36.559 "zcopy": true, 00:16:36.559 "get_zone_info": false, 00:16:36.559 "zone_management": false, 00:16:36.559 "zone_append": false, 00:16:36.559 "compare": false, 00:16:36.559 "compare_and_write": false, 00:16:36.559 "abort": true, 00:16:36.559 "seek_hole": false, 00:16:36.559 "seek_data": false, 00:16:36.559 "copy": true, 00:16:36.559 "nvme_iov_md": false 00:16:36.559 }, 00:16:36.559 "memory_domains": [ 00:16:36.559 { 00:16:36.559 "dma_device_id": "system", 00:16:36.559 "dma_device_type": 1 00:16:36.559 }, 00:16:36.559 { 00:16:36.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.559 "dma_device_type": 2 00:16:36.559 } 00:16:36.559 ], 00:16:36.559 "driver_specific": { 00:16:36.559 "passthru": { 00:16:36.559 "name": "pt1", 00:16:36.559 "base_bdev_name": "malloc1" 00:16:36.559 } 00:16:36.559 } 00:16:36.559 }' 00:16:36.559 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.559 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:16:36.818 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:36.818 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.818 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.818 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:36.818 22:23:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.818 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.818 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:36.818 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.818 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.819 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:36.819 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:36.819 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:36.819 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:37.078 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:37.078 "name": "pt2", 00:16:37.078 "aliases": [ 00:16:37.078 "00000000-0000-0000-0000-000000000002" 00:16:37.078 ], 00:16:37.078 "product_name": "passthru", 00:16:37.078 "block_size": 512, 00:16:37.078 "num_blocks": 65536, 00:16:37.078 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:37.078 "assigned_rate_limits": { 00:16:37.078 "rw_ios_per_sec": 0, 00:16:37.078 "rw_mbytes_per_sec": 0, 00:16:37.078 "r_mbytes_per_sec": 0, 00:16:37.078 "w_mbytes_per_sec": 0 00:16:37.078 }, 00:16:37.078 "claimed": true, 00:16:37.078 "claim_type": "exclusive_write", 00:16:37.078 "zoned": false, 00:16:37.078 "supported_io_types": { 00:16:37.078 "read": true, 00:16:37.078 "write": true, 00:16:37.078 "unmap": true, 00:16:37.078 "flush": true, 00:16:37.078 "reset": true, 00:16:37.078 "nvme_admin": false, 00:16:37.078 "nvme_io": false, 00:16:37.078 "nvme_io_md": false, 00:16:37.078 "write_zeroes": true, 00:16:37.078 "zcopy": true, 00:16:37.078 "get_zone_info": false, 00:16:37.078 "zone_management": false, 00:16:37.078 "zone_append": false, 00:16:37.078 "compare": false, 00:16:37.078 "compare_and_write": false, 00:16:37.078 "abort": true, 00:16:37.078 "seek_hole": false, 00:16:37.078 "seek_data": false, 00:16:37.078 "copy": true, 00:16:37.078 "nvme_iov_md": false 00:16:37.078 }, 00:16:37.078 "memory_domains": [ 00:16:37.078 { 00:16:37.078 "dma_device_id": "system", 00:16:37.078 "dma_device_type": 1 00:16:37.078 }, 00:16:37.078 { 00:16:37.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.078 "dma_device_type": 2 00:16:37.078 } 00:16:37.078 ], 00:16:37.078 "driver_specific": { 00:16:37.078 "passthru": { 00:16:37.078 "name": "pt2", 00:16:37.078 "base_bdev_name": "malloc2" 00:16:37.078 } 00:16:37.078 } 00:16:37.078 }' 00:16:37.078 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.337 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.337 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:37.337 22:23:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.337 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.337 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:37.337 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.337 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.597 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:37.597 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.597 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.597 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:37.597 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:37.597 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:37.597 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:37.858 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:37.858 "name": "pt3", 00:16:37.858 "aliases": [ 00:16:37.858 "00000000-0000-0000-0000-000000000003" 00:16:37.858 ], 00:16:37.858 "product_name": "passthru", 00:16:37.858 "block_size": 512, 00:16:37.858 "num_blocks": 65536, 00:16:37.858 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:37.858 "assigned_rate_limits": { 00:16:37.858 "rw_ios_per_sec": 0, 00:16:37.858 "rw_mbytes_per_sec": 0, 00:16:37.858 "r_mbytes_per_sec": 0, 00:16:37.858 "w_mbytes_per_sec": 0 00:16:37.858 }, 00:16:37.858 "claimed": true, 00:16:37.858 "claim_type": "exclusive_write", 00:16:37.858 "zoned": false, 00:16:37.858 "supported_io_types": { 00:16:37.858 "read": true, 00:16:37.858 "write": true, 00:16:37.858 "unmap": true, 00:16:37.858 "flush": true, 00:16:37.858 "reset": true, 00:16:37.858 "nvme_admin": false, 00:16:37.858 "nvme_io": false, 00:16:37.858 "nvme_io_md": false, 00:16:37.858 "write_zeroes": true, 00:16:37.858 "zcopy": true, 00:16:37.858 "get_zone_info": false, 00:16:37.858 "zone_management": false, 00:16:37.858 "zone_append": false, 00:16:37.858 "compare": false, 00:16:37.858 "compare_and_write": false, 00:16:37.858 "abort": true, 00:16:37.858 "seek_hole": false, 00:16:37.858 "seek_data": false, 00:16:37.858 "copy": true, 00:16:37.858 "nvme_iov_md": false 00:16:37.858 }, 00:16:37.858 "memory_domains": [ 00:16:37.858 { 00:16:37.858 "dma_device_id": "system", 00:16:37.858 "dma_device_type": 1 00:16:37.858 }, 00:16:37.858 { 00:16:37.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.858 "dma_device_type": 2 00:16:37.858 } 00:16:37.858 ], 00:16:37.858 "driver_specific": { 00:16:37.858 "passthru": { 00:16:37.858 "name": "pt3", 00:16:37.858 "base_bdev_name": "malloc3" 00:16:37.858 } 00:16:37.858 } 00:16:37.858 }' 00:16:37.858 22:23:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.858 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.858 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:37.858 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.858 22:23:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.858 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:37.858 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.119 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.119 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:38.119 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.119 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.119 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:38.119 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:38.119 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:16:38.379 [2024-07-12 22:23:48.515250] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 8e105a71-9fa9-4922-9fdb-d8380c456132 '!=' 8e105a71-9fa9-4922-9fdb-d8380c456132 ']' 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3463179 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3463179 ']' 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3463179 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3463179 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3463179' 00:16:38.379 killing process with pid 3463179 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3463179 00:16:38.379 [2024-07-12 22:23:48.587905] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:38.379 [2024-07-12 22:23:48.587969] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:38.379 [2024-07-12 22:23:48.588039] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:38.379 [2024-07-12 22:23:48.588052] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd4bc00 name raid_bdev1, state offline 00:16:38.379 22:23:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3463179 00:16:38.379 [2024-07-12 22:23:48.618773] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:38.639 22:23:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:38.639 00:16:38.639 real 0m14.119s 00:16:38.639 user 0m25.404s 00:16:38.639 sys 0m2.547s 00:16:38.639 22:23:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:38.639 22:23:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:38.639 ************************************ 00:16:38.639 END TEST raid_superblock_test 00:16:38.639 ************************************ 00:16:38.639 22:23:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:38.639 22:23:48 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:16:38.639 22:23:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:38.639 22:23:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:38.639 22:23:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:38.639 ************************************ 00:16:38.639 START TEST raid_read_error_test 00:16:38.639 ************************************ 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:38.639 22:23:48 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.TdSO7lxAg0 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3465370 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3465370 /var/tmp/spdk-raid.sock 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3465370 ']' 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:38.639 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:38.639 22:23:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:38.899 [2024-07-12 22:23:48.985179] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
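The entries that follow build the base-device stack that raid_io_error_test layers its concat array on: for each of the three base bdevs, a malloc bdev is created, wrapped in an error bdev (which gains the EE_ prefix), and exposed through a passthru bdev named BaseBdevN. As a rough sketch only, the stack for one base bdev corresponds to the rpc.py calls already visible in this trace (paths shortened here; assumes bdevperf is already listening on /var/tmp/spdk-raid.sock):

    # malloc backing device: 32 MB, 512-byte blocks
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
    # error-injection wrapper; this creates EE_BaseBdev1_malloc
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
    # passthru on top, giving the name the raid bdev will consume
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1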
00:16:38.899 [2024-07-12 22:23:48.985244] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3465370 ] 00:16:38.899 [2024-07-12 22:23:49.114241] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:38.899 [2024-07-12 22:23:49.216448] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:39.159 [2024-07-12 22:23:49.275026] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:39.159 [2024-07-12 22:23:49.275061] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:39.727 22:23:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:39.727 22:23:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:39.727 22:23:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:39.728 22:23:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:39.987 BaseBdev1_malloc 00:16:39.987 22:23:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:40.247 true 00:16:40.247 22:23:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:40.506 [2024-07-12 22:23:50.638715] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:40.506 [2024-07-12 22:23:50.638763] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:40.506 [2024-07-12 22:23:50.638785] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11220d0 00:16:40.506 [2024-07-12 22:23:50.638797] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:40.506 [2024-07-12 22:23:50.640734] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:40.506 [2024-07-12 22:23:50.640764] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:40.506 BaseBdev1 00:16:40.506 22:23:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:40.506 22:23:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:40.765 BaseBdev2_malloc 00:16:40.765 22:23:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:41.024 true 00:16:41.024 22:23:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:41.024 [2024-07-12 22:23:51.305048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:41.024 [2024-07-12 22:23:51.305091] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:41.024 [2024-07-12 22:23:51.305111] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1126910 00:16:41.024 [2024-07-12 22:23:51.305124] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:41.024 [2024-07-12 22:23:51.306715] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:41.024 [2024-07-12 22:23:51.306742] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:41.024 BaseBdev2 00:16:41.024 22:23:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:41.024 22:23:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:41.284 BaseBdev3_malloc 00:16:41.284 22:23:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:41.542 true 00:16:41.542 22:23:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:41.802 [2024-07-12 22:23:52.040823] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:41.802 [2024-07-12 22:23:52.040872] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:41.802 [2024-07-12 22:23:52.040892] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1128bd0 00:16:41.802 [2024-07-12 22:23:52.040905] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:41.802 [2024-07-12 22:23:52.042542] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:41.802 [2024-07-12 22:23:52.042569] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:41.802 BaseBdev3 00:16:41.802 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:42.061 [2024-07-12 22:23:52.281489] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:42.061 [2024-07-12 22:23:52.282859] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:42.061 [2024-07-12 22:23:52.282942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:42.061 [2024-07-12 22:23:52.283159] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x112a280 00:16:42.061 [2024-07-12 22:23:52.283173] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:42.061 [2024-07-12 22:23:52.283374] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1129e20 00:16:42.061 [2024-07-12 22:23:52.283524] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x112a280 00:16:42.061 [2024-07-12 22:23:52.283534] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x112a280 00:16:42.061 [2024-07-12 22:23:52.283641] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:42.061 
22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:42.061 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:42.061 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:42.061 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:42.061 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:42.061 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:42.061 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.061 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.061 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.061 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.061 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.061 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:42.320 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.320 "name": "raid_bdev1", 00:16:42.320 "uuid": "65994f65-17e0-486d-bc16-61fee064ed97", 00:16:42.320 "strip_size_kb": 64, 00:16:42.320 "state": "online", 00:16:42.320 "raid_level": "concat", 00:16:42.320 "superblock": true, 00:16:42.320 "num_base_bdevs": 3, 00:16:42.320 "num_base_bdevs_discovered": 3, 00:16:42.320 "num_base_bdevs_operational": 3, 00:16:42.320 "base_bdevs_list": [ 00:16:42.320 { 00:16:42.320 "name": "BaseBdev1", 00:16:42.320 "uuid": "683af0b1-eb15-59b8-b250-f8c5da60eca9", 00:16:42.320 "is_configured": true, 00:16:42.320 "data_offset": 2048, 00:16:42.320 "data_size": 63488 00:16:42.320 }, 00:16:42.320 { 00:16:42.320 "name": "BaseBdev2", 00:16:42.320 "uuid": "0d5f0317-7e60-520a-a818-f951171000b3", 00:16:42.320 "is_configured": true, 00:16:42.320 "data_offset": 2048, 00:16:42.320 "data_size": 63488 00:16:42.320 }, 00:16:42.320 { 00:16:42.320 "name": "BaseBdev3", 00:16:42.320 "uuid": "77d11820-88ea-5d43-99c7-80f281f4caab", 00:16:42.320 "is_configured": true, 00:16:42.320 "data_offset": 2048, 00:16:42.320 "data_size": 63488 00:16:42.320 } 00:16:42.320 ] 00:16:42.320 }' 00:16:42.320 22:23:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.320 22:23:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:42.887 22:23:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:42.887 22:23:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:43.146 [2024-07-12 22:23:53.216253] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf784d0 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.084 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:44.344 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.344 "name": "raid_bdev1", 00:16:44.344 "uuid": "65994f65-17e0-486d-bc16-61fee064ed97", 00:16:44.344 "strip_size_kb": 64, 00:16:44.344 "state": "online", 00:16:44.344 "raid_level": "concat", 00:16:44.344 "superblock": true, 00:16:44.344 "num_base_bdevs": 3, 00:16:44.344 "num_base_bdevs_discovered": 3, 00:16:44.344 "num_base_bdevs_operational": 3, 00:16:44.344 "base_bdevs_list": [ 00:16:44.344 { 00:16:44.344 "name": "BaseBdev1", 00:16:44.344 "uuid": "683af0b1-eb15-59b8-b250-f8c5da60eca9", 00:16:44.344 "is_configured": true, 00:16:44.344 "data_offset": 2048, 00:16:44.344 "data_size": 63488 00:16:44.344 }, 00:16:44.344 { 00:16:44.344 "name": "BaseBdev2", 00:16:44.344 "uuid": "0d5f0317-7e60-520a-a818-f951171000b3", 00:16:44.344 "is_configured": true, 00:16:44.344 "data_offset": 2048, 00:16:44.344 "data_size": 63488 00:16:44.344 }, 00:16:44.344 { 00:16:44.344 "name": "BaseBdev3", 00:16:44.344 "uuid": "77d11820-88ea-5d43-99c7-80f281f4caab", 00:16:44.344 "is_configured": true, 00:16:44.344 "data_offset": 2048, 00:16:44.344 "data_size": 63488 00:16:44.344 } 00:16:44.344 ] 00:16:44.344 }' 00:16:44.344 22:23:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.344 22:23:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:44.912 22:23:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:45.201 [2024-07-12 22:23:55.425582] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:45.201 [2024-07-12 22:23:55.425620] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:45.201 [2024-07-12 
22:23:55.428887] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:45.201 [2024-07-12 22:23:55.428933] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:45.201 [2024-07-12 22:23:55.428967] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:45.202 [2024-07-12 22:23:55.428979] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x112a280 name raid_bdev1, state offline 00:16:45.202 0 00:16:45.202 22:23:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3465370 00:16:45.202 22:23:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3465370 ']' 00:16:45.202 22:23:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3465370 00:16:45.202 22:23:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:16:45.202 22:23:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:45.202 22:23:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3465370 00:16:45.202 22:23:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:45.202 22:23:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:45.202 22:23:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3465370' 00:16:45.202 killing process with pid 3465370 00:16:45.202 22:23:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3465370 00:16:45.202 [2024-07-12 22:23:55.491885] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:45.202 22:23:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3465370 00:16:45.202 [2024-07-12 22:23:55.512844] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:45.487 22:23:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.TdSO7lxAg0 00:16:45.487 22:23:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:45.487 22:23:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:45.487 22:23:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:16:45.487 22:23:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:45.487 22:23:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:45.487 22:23:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:45.487 22:23:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:16:45.487 00:16:45.487 real 0m6.844s 00:16:45.487 user 0m10.823s 00:16:45.487 sys 0m1.176s 00:16:45.487 22:23:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:45.487 22:23:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:45.487 ************************************ 00:16:45.487 END TEST raid_read_error_test 00:16:45.487 ************************************ 00:16:45.487 22:23:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:45.487 22:23:55 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:16:45.487 22:23:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 
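The read-error pass that just finished follows the same pattern the write-error pass below repeats: arm the error bdev underneath BaseBdev1, drive I/O through bdevperf, then parse the bdevperf log for raid_bdev1's failure rate and require it to be non-zero (0.45 in this run). A minimal sketch of those steps, using only commands shown in this trace (the log path is whatever mktemp produced, /raidtest/tmp.TdSO7lxAg0 here):

    # inject read failures into the error bdev beneath BaseBdev1
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure
    # kick off the bdevperf job defined at startup (-t 60 -w randrw -M 50 -o 128k -q 1)
    examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
    # column 6 of the raid_bdev1 result row is the fail-per-second figure the test asserts on
    grep -v Job /raidtest/tmp.TdSO7lxAg0 | grep raid_bdev1 | awk '{print $6}'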
00:16:45.487 22:23:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:45.487 22:23:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:45.746 ************************************ 00:16:45.746 START TEST raid_write_error_test 00:16:45.746 ************************************ 00:16:45.746 22:23:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:16:45.746 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:45.746 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:45.746 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:45.746 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:45.746 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:45.746 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:45.746 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:45.746 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:45.746 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:45.746 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.1ODRv0FH6q 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3466354 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3466354 /var/tmp/spdk-raid.sock 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3466354 ']' 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:45.747 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:45.747 22:23:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:45.747 [2024-07-12 22:23:55.963333] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:16:45.747 [2024-07-12 22:23:55.963468] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3466354 ] 00:16:46.005 [2024-07-12 22:23:56.156694] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:46.005 [2024-07-12 22:23:56.252973] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:46.005 [2024-07-12 22:23:56.321739] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:46.005 [2024-07-12 22:23:56.321769] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:46.574 22:23:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:46.574 22:23:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:46.574 22:23:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:46.574 22:23:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:46.833 BaseBdev1_malloc 00:16:46.833 22:23:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:47.091 true 00:16:47.091 22:23:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:47.350 [2024-07-12 22:23:57.558079] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:47.350 [2024-07-12 22:23:57.558126] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:47.350 [2024-07-12 22:23:57.558148] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17780d0 00:16:47.350 [2024-07-12 22:23:57.558160] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:47.350 [2024-07-12 22:23:57.560093] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:47.350 [2024-07-12 
22:23:57.560121] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:47.350 BaseBdev1 00:16:47.350 22:23:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:47.350 22:23:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:47.609 BaseBdev2_malloc 00:16:47.609 22:23:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:47.867 true 00:16:47.867 22:23:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:48.126 [2024-07-12 22:23:58.308708] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:48.126 [2024-07-12 22:23:58.308751] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:48.126 [2024-07-12 22:23:58.308771] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x177c910 00:16:48.126 [2024-07-12 22:23:58.308784] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:48.126 [2024-07-12 22:23:58.310248] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:48.126 [2024-07-12 22:23:58.310276] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:48.126 BaseBdev2 00:16:48.126 22:23:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:48.126 22:23:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:48.385 BaseBdev3_malloc 00:16:48.385 22:23:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:48.645 true 00:16:48.645 22:23:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:48.903 [2024-07-12 22:23:59.052260] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:48.903 [2024-07-12 22:23:59.052308] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:48.903 [2024-07-12 22:23:59.052329] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x177ebd0 00:16:48.903 [2024-07-12 22:23:59.052342] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:48.903 [2024-07-12 22:23:59.053977] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:48.903 [2024-07-12 22:23:59.054006] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:48.903 BaseBdev3 00:16:48.903 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 
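Once the three passthru bdevs exist, the single bdev_raid_create call above assembles the concat array with a 64 KiB strip size (-z 64) and an on-disk superblock (-s). verify_raid_bdev_state then reads the array back over RPC and compares the reported state, raid_level, strip_size_kb and base-bdev counts against the expected values. A hedged sketch of that verification step, reusing the exact calls from the trace:

    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1")'
    # expected: "state": "online", "raid_level": "concat", "strip_size_kb": 64,
    #           num_base_bdevs_discovered == num_base_bdevs_operational == 3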
00:16:49.161 [2024-07-12 22:23:59.292938] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:49.161 [2024-07-12 22:23:59.294326] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:49.161 [2024-07-12 22:23:59.294396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:49.161 [2024-07-12 22:23:59.294611] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1780280 00:16:49.161 [2024-07-12 22:23:59.294623] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:49.161 [2024-07-12 22:23:59.294822] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x177fe20 00:16:49.161 [2024-07-12 22:23:59.294983] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1780280 00:16:49.161 [2024-07-12 22:23:59.294993] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1780280 00:16:49.161 [2024-07-12 22:23:59.295096] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:49.161 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:49.161 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:49.161 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:49.161 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:49.161 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:49.161 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:49.162 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.162 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.162 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.162 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.162 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.162 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:49.420 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.420 "name": "raid_bdev1", 00:16:49.420 "uuid": "de459e81-293a-4d2d-b470-c09d9e5173ab", 00:16:49.420 "strip_size_kb": 64, 00:16:49.420 "state": "online", 00:16:49.420 "raid_level": "concat", 00:16:49.420 "superblock": true, 00:16:49.420 "num_base_bdevs": 3, 00:16:49.420 "num_base_bdevs_discovered": 3, 00:16:49.420 "num_base_bdevs_operational": 3, 00:16:49.420 "base_bdevs_list": [ 00:16:49.420 { 00:16:49.420 "name": "BaseBdev1", 00:16:49.420 "uuid": "e375098b-3566-54fe-a84a-521b46be0320", 00:16:49.420 "is_configured": true, 00:16:49.420 "data_offset": 2048, 00:16:49.420 "data_size": 63488 00:16:49.420 }, 00:16:49.420 { 00:16:49.420 "name": "BaseBdev2", 00:16:49.420 "uuid": "c3427775-5a47-5c8d-9957-5d4103550e2e", 00:16:49.420 "is_configured": true, 00:16:49.420 "data_offset": 2048, 00:16:49.420 "data_size": 63488 00:16:49.420 }, 00:16:49.420 { 00:16:49.420 
"name": "BaseBdev3", 00:16:49.420 "uuid": "9ac12cca-3f71-58d8-934b-2504d5aeed7d", 00:16:49.420 "is_configured": true, 00:16:49.420 "data_offset": 2048, 00:16:49.420 "data_size": 63488 00:16:49.420 } 00:16:49.420 ] 00:16:49.420 }' 00:16:49.420 22:23:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.420 22:23:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:49.987 22:24:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:49.987 22:24:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:49.987 [2024-07-12 22:24:00.255771] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15ce4d0 00:16:50.924 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.183 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:51.442 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:51.442 "name": "raid_bdev1", 00:16:51.442 "uuid": "de459e81-293a-4d2d-b470-c09d9e5173ab", 00:16:51.442 "strip_size_kb": 64, 00:16:51.442 "state": "online", 00:16:51.442 "raid_level": "concat", 00:16:51.442 "superblock": true, 00:16:51.442 "num_base_bdevs": 3, 00:16:51.442 "num_base_bdevs_discovered": 3, 00:16:51.442 "num_base_bdevs_operational": 3, 00:16:51.442 "base_bdevs_list": [ 00:16:51.442 { 00:16:51.442 "name": "BaseBdev1", 00:16:51.442 "uuid": "e375098b-3566-54fe-a84a-521b46be0320", 00:16:51.442 "is_configured": true, 00:16:51.442 "data_offset": 2048, 00:16:51.442 "data_size": 
63488 00:16:51.442 }, 00:16:51.442 { 00:16:51.442 "name": "BaseBdev2", 00:16:51.442 "uuid": "c3427775-5a47-5c8d-9957-5d4103550e2e", 00:16:51.442 "is_configured": true, 00:16:51.442 "data_offset": 2048, 00:16:51.442 "data_size": 63488 00:16:51.442 }, 00:16:51.442 { 00:16:51.442 "name": "BaseBdev3", 00:16:51.442 "uuid": "9ac12cca-3f71-58d8-934b-2504d5aeed7d", 00:16:51.442 "is_configured": true, 00:16:51.442 "data_offset": 2048, 00:16:51.442 "data_size": 63488 00:16:51.442 } 00:16:51.442 ] 00:16:51.442 }' 00:16:51.442 22:24:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:51.442 22:24:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.009 22:24:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:52.268 [2024-07-12 22:24:02.488961] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:52.268 [2024-07-12 22:24:02.489002] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:52.268 [2024-07-12 22:24:02.492247] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:52.268 [2024-07-12 22:24:02.492285] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:52.268 [2024-07-12 22:24:02.492320] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:52.268 [2024-07-12 22:24:02.492330] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1780280 name raid_bdev1, state offline 00:16:52.268 0 00:16:52.268 22:24:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3466354 00:16:52.268 22:24:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3466354 ']' 00:16:52.268 22:24:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3466354 00:16:52.268 22:24:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:52.268 22:24:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:52.268 22:24:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3466354 00:16:52.268 22:24:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:52.268 22:24:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:52.268 22:24:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3466354' 00:16:52.268 killing process with pid 3466354 00:16:52.268 22:24:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3466354 00:16:52.268 [2024-07-12 22:24:02.560279] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:52.268 22:24:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3466354 00:16:52.268 [2024-07-12 22:24:02.581804] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:52.527 22:24:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.1ODRv0FH6q 00:16:52.527 22:24:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:52.527 22:24:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:52.527 22:24:02 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:16:52.527 22:24:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:52.527 22:24:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:52.527 22:24:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:52.527 22:24:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:16:52.527 00:16:52.527 real 0m6.977s 00:16:52.527 user 0m10.986s 00:16:52.527 sys 0m1.280s 00:16:52.527 22:24:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:52.527 22:24:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.527 ************************************ 00:16:52.527 END TEST raid_write_error_test 00:16:52.527 ************************************ 00:16:52.784 22:24:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:52.784 22:24:02 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:52.784 22:24:02 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:16:52.784 22:24:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:52.784 22:24:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:52.784 22:24:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:52.784 ************************************ 00:16:52.784 START TEST raid_state_function_test 00:16:52.784 ************************************ 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3467457 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3467457' 00:16:52.784 Process raid pid: 3467457 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3467457 /var/tmp/spdk-raid.sock 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3467457 ']' 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:52.784 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:52.784 22:24:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.784 [2024-07-12 22:24:02.960783] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
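raid_state_function_test runs against the lightweight bdev_svc app rather than bdevperf and exercises the raid bdev state machine directly: the entries below first create Existed_Raid as a raid1 array over base bdevs that do not exist yet, so the array must report the "configuring" state with zero discovered base bdevs, and it only progresses as BaseBdev1/2/3 are created one by one. The first check amounts to, roughly (same socket and names as in this run):

    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid")'
    # expected while no base bdevs exist: "state": "configuring",
    # "num_base_bdevs_discovered": 0, "num_base_bdevs_operational": 3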
00:16:52.784 [2024-07-12 22:24:02.960852] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:52.784 [2024-07-12 22:24:03.090114] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:53.043 [2024-07-12 22:24:03.194582] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:53.043 [2024-07-12 22:24:03.254931] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:53.043 [2024-07-12 22:24:03.254958] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:53.610 22:24:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:53.610 22:24:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:53.610 22:24:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:53.870 [2024-07-12 22:24:04.090416] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:53.870 [2024-07-12 22:24:04.090460] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:53.870 [2024-07-12 22:24:04.090471] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:53.870 [2024-07-12 22:24:04.090483] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:53.870 [2024-07-12 22:24:04.090492] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:53.870 [2024-07-12 22:24:04.090504] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:53.870 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:53.870 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:53.870 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:53.870 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:53.870 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:53.870 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:53.870 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:53.870 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:53.870 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:53.870 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:53.870 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.870 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:54.129 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:16:54.129 "name": "Existed_Raid", 00:16:54.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.129 "strip_size_kb": 0, 00:16:54.129 "state": "configuring", 00:16:54.129 "raid_level": "raid1", 00:16:54.129 "superblock": false, 00:16:54.129 "num_base_bdevs": 3, 00:16:54.129 "num_base_bdevs_discovered": 0, 00:16:54.129 "num_base_bdevs_operational": 3, 00:16:54.129 "base_bdevs_list": [ 00:16:54.129 { 00:16:54.129 "name": "BaseBdev1", 00:16:54.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.129 "is_configured": false, 00:16:54.129 "data_offset": 0, 00:16:54.129 "data_size": 0 00:16:54.129 }, 00:16:54.129 { 00:16:54.129 "name": "BaseBdev2", 00:16:54.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.129 "is_configured": false, 00:16:54.129 "data_offset": 0, 00:16:54.129 "data_size": 0 00:16:54.129 }, 00:16:54.129 { 00:16:54.129 "name": "BaseBdev3", 00:16:54.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.129 "is_configured": false, 00:16:54.129 "data_offset": 0, 00:16:54.129 "data_size": 0 00:16:54.129 } 00:16:54.129 ] 00:16:54.129 }' 00:16:54.129 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:54.129 22:24:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:54.699 22:24:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:54.958 [2024-07-12 22:24:05.133060] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:54.958 [2024-07-12 22:24:05.133098] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xacba80 name Existed_Raid, state configuring 00:16:54.958 22:24:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:55.217 [2024-07-12 22:24:05.381722] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:55.217 [2024-07-12 22:24:05.381758] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:55.217 [2024-07-12 22:24:05.381768] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:55.217 [2024-07-12 22:24:05.381780] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:55.217 [2024-07-12 22:24:05.381789] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:55.218 [2024-07-12 22:24:05.381800] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:55.218 22:24:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:55.477 [2024-07-12 22:24:05.636167] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:55.477 BaseBdev1 00:16:55.477 22:24:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:55.477 22:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:55.477 22:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:55.477 
22:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:55.477 22:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:55.477 22:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:55.477 22:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:55.736 22:24:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:55.996 [ 00:16:55.996 { 00:16:55.996 "name": "BaseBdev1", 00:16:55.996 "aliases": [ 00:16:55.996 "cdb3660b-ffd5-4d71-b346-01fc5f27f08f" 00:16:55.996 ], 00:16:55.996 "product_name": "Malloc disk", 00:16:55.996 "block_size": 512, 00:16:55.996 "num_blocks": 65536, 00:16:55.996 "uuid": "cdb3660b-ffd5-4d71-b346-01fc5f27f08f", 00:16:55.996 "assigned_rate_limits": { 00:16:55.996 "rw_ios_per_sec": 0, 00:16:55.996 "rw_mbytes_per_sec": 0, 00:16:55.996 "r_mbytes_per_sec": 0, 00:16:55.996 "w_mbytes_per_sec": 0 00:16:55.996 }, 00:16:55.996 "claimed": true, 00:16:55.996 "claim_type": "exclusive_write", 00:16:55.996 "zoned": false, 00:16:55.996 "supported_io_types": { 00:16:55.996 "read": true, 00:16:55.996 "write": true, 00:16:55.996 "unmap": true, 00:16:55.996 "flush": true, 00:16:55.996 "reset": true, 00:16:55.996 "nvme_admin": false, 00:16:55.996 "nvme_io": false, 00:16:55.996 "nvme_io_md": false, 00:16:55.996 "write_zeroes": true, 00:16:55.996 "zcopy": true, 00:16:55.996 "get_zone_info": false, 00:16:55.996 "zone_management": false, 00:16:55.996 "zone_append": false, 00:16:55.996 "compare": false, 00:16:55.996 "compare_and_write": false, 00:16:55.996 "abort": true, 00:16:55.996 "seek_hole": false, 00:16:55.996 "seek_data": false, 00:16:55.996 "copy": true, 00:16:55.996 "nvme_iov_md": false 00:16:55.996 }, 00:16:55.996 "memory_domains": [ 00:16:55.996 { 00:16:55.996 "dma_device_id": "system", 00:16:55.996 "dma_device_type": 1 00:16:55.996 }, 00:16:55.996 { 00:16:55.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.996 "dma_device_type": 2 00:16:55.996 } 00:16:55.996 ], 00:16:55.996 "driver_specific": {} 00:16:55.996 } 00:16:55.996 ] 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.996 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:56.256 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:56.256 "name": "Existed_Raid", 00:16:56.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:56.256 "strip_size_kb": 0, 00:16:56.256 "state": "configuring", 00:16:56.256 "raid_level": "raid1", 00:16:56.256 "superblock": false, 00:16:56.256 "num_base_bdevs": 3, 00:16:56.256 "num_base_bdevs_discovered": 1, 00:16:56.256 "num_base_bdevs_operational": 3, 00:16:56.256 "base_bdevs_list": [ 00:16:56.256 { 00:16:56.256 "name": "BaseBdev1", 00:16:56.256 "uuid": "cdb3660b-ffd5-4d71-b346-01fc5f27f08f", 00:16:56.256 "is_configured": true, 00:16:56.256 "data_offset": 0, 00:16:56.256 "data_size": 65536 00:16:56.256 }, 00:16:56.256 { 00:16:56.256 "name": "BaseBdev2", 00:16:56.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:56.256 "is_configured": false, 00:16:56.256 "data_offset": 0, 00:16:56.256 "data_size": 0 00:16:56.256 }, 00:16:56.256 { 00:16:56.256 "name": "BaseBdev3", 00:16:56.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:56.256 "is_configured": false, 00:16:56.256 "data_offset": 0, 00:16:56.256 "data_size": 0 00:16:56.256 } 00:16:56.256 ] 00:16:56.256 }' 00:16:56.256 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:56.256 22:24:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:56.825 22:24:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:56.825 [2024-07-12 22:24:07.140144] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:56.825 [2024-07-12 22:24:07.140182] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xacb310 name Existed_Raid, state configuring 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:57.085 [2024-07-12 22:24:07.380807] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:57.085 [2024-07-12 22:24:07.382260] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:57.085 [2024-07-12 22:24:07.382293] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:57.085 [2024-07-12 22:24:07.382304] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:57.085 [2024-07-12 22:24:07.382316] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.085 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.345 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.345 "name": "Existed_Raid", 00:16:57.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.345 "strip_size_kb": 0, 00:16:57.345 "state": "configuring", 00:16:57.345 "raid_level": "raid1", 00:16:57.345 "superblock": false, 00:16:57.345 "num_base_bdevs": 3, 00:16:57.345 "num_base_bdevs_discovered": 1, 00:16:57.345 "num_base_bdevs_operational": 3, 00:16:57.345 "base_bdevs_list": [ 00:16:57.345 { 00:16:57.345 "name": "BaseBdev1", 00:16:57.345 "uuid": "cdb3660b-ffd5-4d71-b346-01fc5f27f08f", 00:16:57.345 "is_configured": true, 00:16:57.345 "data_offset": 0, 00:16:57.345 "data_size": 65536 00:16:57.345 }, 00:16:57.345 { 00:16:57.345 "name": "BaseBdev2", 00:16:57.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.345 "is_configured": false, 00:16:57.345 "data_offset": 0, 00:16:57.345 "data_size": 0 00:16:57.345 }, 00:16:57.345 { 00:16:57.345 "name": "BaseBdev3", 00:16:57.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.345 "is_configured": false, 00:16:57.345 "data_offset": 0, 00:16:57.345 "data_size": 0 00:16:57.345 } 00:16:57.345 ] 00:16:57.345 }' 00:16:57.345 22:24:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.345 22:24:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:58.282 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:58.282 [2024-07-12 22:24:08.402914] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:58.282 BaseBdev2 00:16:58.282 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:58.282 22:24:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:58.282 22:24:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:58.282 22:24:08 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:58.282 22:24:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:58.282 22:24:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:58.282 22:24:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:58.542 [ 00:16:58.542 { 00:16:58.542 "name": "BaseBdev2", 00:16:58.542 "aliases": [ 00:16:58.542 "e9457893-9fdd-4fd1-9633-2dd10e992ecd" 00:16:58.542 ], 00:16:58.542 "product_name": "Malloc disk", 00:16:58.542 "block_size": 512, 00:16:58.542 "num_blocks": 65536, 00:16:58.542 "uuid": "e9457893-9fdd-4fd1-9633-2dd10e992ecd", 00:16:58.542 "assigned_rate_limits": { 00:16:58.542 "rw_ios_per_sec": 0, 00:16:58.542 "rw_mbytes_per_sec": 0, 00:16:58.542 "r_mbytes_per_sec": 0, 00:16:58.542 "w_mbytes_per_sec": 0 00:16:58.542 }, 00:16:58.542 "claimed": true, 00:16:58.542 "claim_type": "exclusive_write", 00:16:58.542 "zoned": false, 00:16:58.542 "supported_io_types": { 00:16:58.542 "read": true, 00:16:58.542 "write": true, 00:16:58.542 "unmap": true, 00:16:58.542 "flush": true, 00:16:58.542 "reset": true, 00:16:58.542 "nvme_admin": false, 00:16:58.542 "nvme_io": false, 00:16:58.542 "nvme_io_md": false, 00:16:58.542 "write_zeroes": true, 00:16:58.542 "zcopy": true, 00:16:58.542 "get_zone_info": false, 00:16:58.542 "zone_management": false, 00:16:58.542 "zone_append": false, 00:16:58.542 "compare": false, 00:16:58.542 "compare_and_write": false, 00:16:58.542 "abort": true, 00:16:58.542 "seek_hole": false, 00:16:58.542 "seek_data": false, 00:16:58.542 "copy": true, 00:16:58.542 "nvme_iov_md": false 00:16:58.542 }, 00:16:58.542 "memory_domains": [ 00:16:58.542 { 00:16:58.542 "dma_device_id": "system", 00:16:58.542 "dma_device_type": 1 00:16:58.542 }, 00:16:58.542 { 00:16:58.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.542 "dma_device_type": 2 00:16:58.542 } 00:16:58.542 ], 00:16:58.542 "driver_specific": {} 00:16:58.542 } 00:16:58.542 ] 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.542 
22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.542 22:24:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:58.802 22:24:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.802 "name": "Existed_Raid", 00:16:58.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.802 "strip_size_kb": 0, 00:16:58.802 "state": "configuring", 00:16:58.802 "raid_level": "raid1", 00:16:58.802 "superblock": false, 00:16:58.802 "num_base_bdevs": 3, 00:16:58.802 "num_base_bdevs_discovered": 2, 00:16:58.802 "num_base_bdevs_operational": 3, 00:16:58.802 "base_bdevs_list": [ 00:16:58.802 { 00:16:58.802 "name": "BaseBdev1", 00:16:58.802 "uuid": "cdb3660b-ffd5-4d71-b346-01fc5f27f08f", 00:16:58.802 "is_configured": true, 00:16:58.802 "data_offset": 0, 00:16:58.802 "data_size": 65536 00:16:58.802 }, 00:16:58.802 { 00:16:58.802 "name": "BaseBdev2", 00:16:58.802 "uuid": "e9457893-9fdd-4fd1-9633-2dd10e992ecd", 00:16:58.802 "is_configured": true, 00:16:58.802 "data_offset": 0, 00:16:58.802 "data_size": 65536 00:16:58.802 }, 00:16:58.802 { 00:16:58.802 "name": "BaseBdev3", 00:16:58.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.802 "is_configured": false, 00:16:58.802 "data_offset": 0, 00:16:58.802 "data_size": 0 00:16:58.802 } 00:16:58.802 ] 00:16:58.802 }' 00:16:58.802 22:24:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.802 22:24:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:59.371 22:24:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:59.630 [2024-07-12 22:24:09.705747] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:59.630 [2024-07-12 22:24:09.705784] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xacc400 00:16:59.630 [2024-07-12 22:24:09.705793] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:59.630 [2024-07-12 22:24:09.706053] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xacbef0 00:16:59.630 [2024-07-12 22:24:09.706179] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xacc400 00:16:59.630 [2024-07-12 22:24:09.706189] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xacc400 00:16:59.630 [2024-07-12 22:24:09.706353] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:59.630 BaseBdev3 00:16:59.630 22:24:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:59.630 22:24:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:59.630 22:24:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:59.630 22:24:09 
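For reference, a hand-run equivalent of the step above (same socket and bdev names as in the trace; the trailing ".state" jq step is a convenience, not taken verbatim from the script): once the third malloc bdev is created, the raid1 bdev assembles automatically and the state reported by bdev_raid_get_bdevs flips from "configuring" to "online".

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_malloc_create 32 512 -b BaseBdev3
    # All members present: the array registers its io device and goes online.
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'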
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:59.630 22:24:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:59.630 22:24:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:59.630 22:24:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:59.889 22:24:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:59.889 [ 00:16:59.889 { 00:16:59.889 "name": "BaseBdev3", 00:16:59.889 "aliases": [ 00:16:59.889 "916cb0bb-90bc-4021-9b8b-0453db4428f4" 00:16:59.889 ], 00:16:59.889 "product_name": "Malloc disk", 00:16:59.889 "block_size": 512, 00:16:59.889 "num_blocks": 65536, 00:16:59.889 "uuid": "916cb0bb-90bc-4021-9b8b-0453db4428f4", 00:16:59.889 "assigned_rate_limits": { 00:16:59.889 "rw_ios_per_sec": 0, 00:16:59.889 "rw_mbytes_per_sec": 0, 00:16:59.889 "r_mbytes_per_sec": 0, 00:16:59.889 "w_mbytes_per_sec": 0 00:16:59.889 }, 00:16:59.889 "claimed": true, 00:16:59.889 "claim_type": "exclusive_write", 00:16:59.889 "zoned": false, 00:16:59.889 "supported_io_types": { 00:16:59.889 "read": true, 00:16:59.889 "write": true, 00:16:59.889 "unmap": true, 00:16:59.889 "flush": true, 00:16:59.889 "reset": true, 00:16:59.889 "nvme_admin": false, 00:16:59.889 "nvme_io": false, 00:16:59.889 "nvme_io_md": false, 00:16:59.889 "write_zeroes": true, 00:16:59.889 "zcopy": true, 00:16:59.889 "get_zone_info": false, 00:16:59.889 "zone_management": false, 00:16:59.889 "zone_append": false, 00:16:59.889 "compare": false, 00:16:59.889 "compare_and_write": false, 00:16:59.889 "abort": true, 00:16:59.889 "seek_hole": false, 00:16:59.889 "seek_data": false, 00:16:59.889 "copy": true, 00:16:59.889 "nvme_iov_md": false 00:16:59.889 }, 00:16:59.889 "memory_domains": [ 00:16:59.889 { 00:16:59.889 "dma_device_id": "system", 00:16:59.889 "dma_device_type": 1 00:16:59.889 }, 00:16:59.889 { 00:16:59.889 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.889 "dma_device_type": 2 00:16:59.889 } 00:16:59.889 ], 00:16:59.889 "driver_specific": {} 00:16:59.889 } 00:16:59.889 ] 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:00.149 22:24:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.149 "name": "Existed_Raid", 00:17:00.149 "uuid": "66cbdfc1-02ac-4896-b556-19475e363815", 00:17:00.149 "strip_size_kb": 0, 00:17:00.149 "state": "online", 00:17:00.149 "raid_level": "raid1", 00:17:00.149 "superblock": false, 00:17:00.149 "num_base_bdevs": 3, 00:17:00.149 "num_base_bdevs_discovered": 3, 00:17:00.149 "num_base_bdevs_operational": 3, 00:17:00.149 "base_bdevs_list": [ 00:17:00.149 { 00:17:00.149 "name": "BaseBdev1", 00:17:00.149 "uuid": "cdb3660b-ffd5-4d71-b346-01fc5f27f08f", 00:17:00.149 "is_configured": true, 00:17:00.149 "data_offset": 0, 00:17:00.149 "data_size": 65536 00:17:00.149 }, 00:17:00.149 { 00:17:00.149 "name": "BaseBdev2", 00:17:00.149 "uuid": "e9457893-9fdd-4fd1-9633-2dd10e992ecd", 00:17:00.149 "is_configured": true, 00:17:00.149 "data_offset": 0, 00:17:00.149 "data_size": 65536 00:17:00.149 }, 00:17:00.149 { 00:17:00.149 "name": "BaseBdev3", 00:17:00.149 "uuid": "916cb0bb-90bc-4021-9b8b-0453db4428f4", 00:17:00.149 "is_configured": true, 00:17:00.149 "data_offset": 0, 00:17:00.149 "data_size": 65536 00:17:00.149 } 00:17:00.149 ] 00:17:00.149 }' 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.149 22:24:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:00.717 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:00.717 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:00.717 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:00.717 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:00.717 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:00.717 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:00.717 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:00.717 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:00.977 [2024-07-12 22:24:11.230104] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:00.977 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:00.977 "name": "Existed_Raid", 00:17:00.977 "aliases": [ 00:17:00.977 "66cbdfc1-02ac-4896-b556-19475e363815" 00:17:00.977 ], 00:17:00.977 "product_name": "Raid Volume", 00:17:00.977 "block_size": 512, 00:17:00.977 "num_blocks": 65536, 00:17:00.977 "uuid": "66cbdfc1-02ac-4896-b556-19475e363815", 
00:17:00.977 "assigned_rate_limits": { 00:17:00.977 "rw_ios_per_sec": 0, 00:17:00.977 "rw_mbytes_per_sec": 0, 00:17:00.977 "r_mbytes_per_sec": 0, 00:17:00.977 "w_mbytes_per_sec": 0 00:17:00.977 }, 00:17:00.977 "claimed": false, 00:17:00.977 "zoned": false, 00:17:00.977 "supported_io_types": { 00:17:00.977 "read": true, 00:17:00.977 "write": true, 00:17:00.977 "unmap": false, 00:17:00.977 "flush": false, 00:17:00.977 "reset": true, 00:17:00.977 "nvme_admin": false, 00:17:00.977 "nvme_io": false, 00:17:00.977 "nvme_io_md": false, 00:17:00.977 "write_zeroes": true, 00:17:00.977 "zcopy": false, 00:17:00.977 "get_zone_info": false, 00:17:00.977 "zone_management": false, 00:17:00.977 "zone_append": false, 00:17:00.977 "compare": false, 00:17:00.977 "compare_and_write": false, 00:17:00.977 "abort": false, 00:17:00.977 "seek_hole": false, 00:17:00.977 "seek_data": false, 00:17:00.977 "copy": false, 00:17:00.977 "nvme_iov_md": false 00:17:00.977 }, 00:17:00.977 "memory_domains": [ 00:17:00.977 { 00:17:00.977 "dma_device_id": "system", 00:17:00.977 "dma_device_type": 1 00:17:00.977 }, 00:17:00.977 { 00:17:00.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.977 "dma_device_type": 2 00:17:00.977 }, 00:17:00.977 { 00:17:00.977 "dma_device_id": "system", 00:17:00.977 "dma_device_type": 1 00:17:00.977 }, 00:17:00.977 { 00:17:00.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.977 "dma_device_type": 2 00:17:00.977 }, 00:17:00.977 { 00:17:00.977 "dma_device_id": "system", 00:17:00.977 "dma_device_type": 1 00:17:00.977 }, 00:17:00.977 { 00:17:00.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.977 "dma_device_type": 2 00:17:00.977 } 00:17:00.977 ], 00:17:00.977 "driver_specific": { 00:17:00.977 "raid": { 00:17:00.977 "uuid": "66cbdfc1-02ac-4896-b556-19475e363815", 00:17:00.977 "strip_size_kb": 0, 00:17:00.977 "state": "online", 00:17:00.977 "raid_level": "raid1", 00:17:00.977 "superblock": false, 00:17:00.977 "num_base_bdevs": 3, 00:17:00.977 "num_base_bdevs_discovered": 3, 00:17:00.977 "num_base_bdevs_operational": 3, 00:17:00.977 "base_bdevs_list": [ 00:17:00.977 { 00:17:00.977 "name": "BaseBdev1", 00:17:00.977 "uuid": "cdb3660b-ffd5-4d71-b346-01fc5f27f08f", 00:17:00.977 "is_configured": true, 00:17:00.977 "data_offset": 0, 00:17:00.977 "data_size": 65536 00:17:00.977 }, 00:17:00.977 { 00:17:00.977 "name": "BaseBdev2", 00:17:00.977 "uuid": "e9457893-9fdd-4fd1-9633-2dd10e992ecd", 00:17:00.977 "is_configured": true, 00:17:00.977 "data_offset": 0, 00:17:00.977 "data_size": 65536 00:17:00.977 }, 00:17:00.977 { 00:17:00.977 "name": "BaseBdev3", 00:17:00.977 "uuid": "916cb0bb-90bc-4021-9b8b-0453db4428f4", 00:17:00.977 "is_configured": true, 00:17:00.977 "data_offset": 0, 00:17:00.977 "data_size": 65536 00:17:00.977 } 00:17:00.977 ] 00:17:00.977 } 00:17:00.977 } 00:17:00.977 }' 00:17:00.977 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:00.977 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:00.977 BaseBdev2 00:17:00.977 BaseBdev3' 00:17:00.977 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:00.977 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:00.977 22:24:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:01.237 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:01.237 "name": "BaseBdev1", 00:17:01.237 "aliases": [ 00:17:01.237 "cdb3660b-ffd5-4d71-b346-01fc5f27f08f" 00:17:01.237 ], 00:17:01.237 "product_name": "Malloc disk", 00:17:01.237 "block_size": 512, 00:17:01.237 "num_blocks": 65536, 00:17:01.237 "uuid": "cdb3660b-ffd5-4d71-b346-01fc5f27f08f", 00:17:01.237 "assigned_rate_limits": { 00:17:01.237 "rw_ios_per_sec": 0, 00:17:01.237 "rw_mbytes_per_sec": 0, 00:17:01.237 "r_mbytes_per_sec": 0, 00:17:01.237 "w_mbytes_per_sec": 0 00:17:01.237 }, 00:17:01.237 "claimed": true, 00:17:01.237 "claim_type": "exclusive_write", 00:17:01.237 "zoned": false, 00:17:01.237 "supported_io_types": { 00:17:01.237 "read": true, 00:17:01.237 "write": true, 00:17:01.237 "unmap": true, 00:17:01.237 "flush": true, 00:17:01.237 "reset": true, 00:17:01.237 "nvme_admin": false, 00:17:01.237 "nvme_io": false, 00:17:01.237 "nvme_io_md": false, 00:17:01.237 "write_zeroes": true, 00:17:01.237 "zcopy": true, 00:17:01.237 "get_zone_info": false, 00:17:01.237 "zone_management": false, 00:17:01.237 "zone_append": false, 00:17:01.237 "compare": false, 00:17:01.237 "compare_and_write": false, 00:17:01.237 "abort": true, 00:17:01.237 "seek_hole": false, 00:17:01.237 "seek_data": false, 00:17:01.237 "copy": true, 00:17:01.237 "nvme_iov_md": false 00:17:01.237 }, 00:17:01.237 "memory_domains": [ 00:17:01.237 { 00:17:01.237 "dma_device_id": "system", 00:17:01.237 "dma_device_type": 1 00:17:01.237 }, 00:17:01.237 { 00:17:01.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.237 "dma_device_type": 2 00:17:01.237 } 00:17:01.237 ], 00:17:01.237 "driver_specific": {} 00:17:01.237 }' 00:17:01.237 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.496 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.496 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:01.496 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.496 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.496 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:01.496 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.496 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.496 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:01.496 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.756 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.756 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:01.756 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:01.756 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:01.757 22:24:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:02.016 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.016 "name": "BaseBdev2", 
00:17:02.016 "aliases": [ 00:17:02.016 "e9457893-9fdd-4fd1-9633-2dd10e992ecd" 00:17:02.016 ], 00:17:02.016 "product_name": "Malloc disk", 00:17:02.016 "block_size": 512, 00:17:02.016 "num_blocks": 65536, 00:17:02.016 "uuid": "e9457893-9fdd-4fd1-9633-2dd10e992ecd", 00:17:02.016 "assigned_rate_limits": { 00:17:02.016 "rw_ios_per_sec": 0, 00:17:02.016 "rw_mbytes_per_sec": 0, 00:17:02.016 "r_mbytes_per_sec": 0, 00:17:02.016 "w_mbytes_per_sec": 0 00:17:02.016 }, 00:17:02.016 "claimed": true, 00:17:02.016 "claim_type": "exclusive_write", 00:17:02.016 "zoned": false, 00:17:02.016 "supported_io_types": { 00:17:02.016 "read": true, 00:17:02.016 "write": true, 00:17:02.016 "unmap": true, 00:17:02.016 "flush": true, 00:17:02.016 "reset": true, 00:17:02.016 "nvme_admin": false, 00:17:02.016 "nvme_io": false, 00:17:02.016 "nvme_io_md": false, 00:17:02.016 "write_zeroes": true, 00:17:02.016 "zcopy": true, 00:17:02.016 "get_zone_info": false, 00:17:02.016 "zone_management": false, 00:17:02.016 "zone_append": false, 00:17:02.016 "compare": false, 00:17:02.016 "compare_and_write": false, 00:17:02.016 "abort": true, 00:17:02.016 "seek_hole": false, 00:17:02.016 "seek_data": false, 00:17:02.016 "copy": true, 00:17:02.016 "nvme_iov_md": false 00:17:02.016 }, 00:17:02.016 "memory_domains": [ 00:17:02.016 { 00:17:02.016 "dma_device_id": "system", 00:17:02.016 "dma_device_type": 1 00:17:02.016 }, 00:17:02.016 { 00:17:02.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.016 "dma_device_type": 2 00:17:02.016 } 00:17:02.016 ], 00:17:02.016 "driver_specific": {} 00:17:02.016 }' 00:17:02.016 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.016 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.016 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.016 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.016 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.016 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.016 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.276 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.276 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.276 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.276 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.276 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:02.276 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:02.276 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:02.276 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.536 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.536 "name": "BaseBdev3", 00:17:02.536 "aliases": [ 00:17:02.536 "916cb0bb-90bc-4021-9b8b-0453db4428f4" 00:17:02.536 ], 00:17:02.536 "product_name": "Malloc disk", 00:17:02.536 "block_size": 512, 
00:17:02.536 "num_blocks": 65536, 00:17:02.536 "uuid": "916cb0bb-90bc-4021-9b8b-0453db4428f4", 00:17:02.536 "assigned_rate_limits": { 00:17:02.536 "rw_ios_per_sec": 0, 00:17:02.536 "rw_mbytes_per_sec": 0, 00:17:02.536 "r_mbytes_per_sec": 0, 00:17:02.536 "w_mbytes_per_sec": 0 00:17:02.536 }, 00:17:02.536 "claimed": true, 00:17:02.536 "claim_type": "exclusive_write", 00:17:02.536 "zoned": false, 00:17:02.536 "supported_io_types": { 00:17:02.536 "read": true, 00:17:02.536 "write": true, 00:17:02.536 "unmap": true, 00:17:02.536 "flush": true, 00:17:02.536 "reset": true, 00:17:02.536 "nvme_admin": false, 00:17:02.536 "nvme_io": false, 00:17:02.536 "nvme_io_md": false, 00:17:02.536 "write_zeroes": true, 00:17:02.536 "zcopy": true, 00:17:02.536 "get_zone_info": false, 00:17:02.536 "zone_management": false, 00:17:02.536 "zone_append": false, 00:17:02.536 "compare": false, 00:17:02.536 "compare_and_write": false, 00:17:02.536 "abort": true, 00:17:02.536 "seek_hole": false, 00:17:02.536 "seek_data": false, 00:17:02.536 "copy": true, 00:17:02.536 "nvme_iov_md": false 00:17:02.536 }, 00:17:02.536 "memory_domains": [ 00:17:02.536 { 00:17:02.536 "dma_device_id": "system", 00:17:02.536 "dma_device_type": 1 00:17:02.536 }, 00:17:02.536 { 00:17:02.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.536 "dma_device_type": 2 00:17:02.536 } 00:17:02.536 ], 00:17:02.536 "driver_specific": {} 00:17:02.536 }' 00:17:02.536 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.536 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.536 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.536 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.794 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.794 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.794 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.794 22:24:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.794 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.794 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.794 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.794 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:02.794 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:03.053 [2024-07-12 22:24:13.315381] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:03.053 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.313 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:03.313 "name": "Existed_Raid", 00:17:03.313 "uuid": "66cbdfc1-02ac-4896-b556-19475e363815", 00:17:03.313 "strip_size_kb": 0, 00:17:03.313 "state": "online", 00:17:03.313 "raid_level": "raid1", 00:17:03.313 "superblock": false, 00:17:03.313 "num_base_bdevs": 3, 00:17:03.313 "num_base_bdevs_discovered": 2, 00:17:03.313 "num_base_bdevs_operational": 2, 00:17:03.313 "base_bdevs_list": [ 00:17:03.313 { 00:17:03.313 "name": null, 00:17:03.313 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.313 "is_configured": false, 00:17:03.313 "data_offset": 0, 00:17:03.313 "data_size": 65536 00:17:03.313 }, 00:17:03.313 { 00:17:03.313 "name": "BaseBdev2", 00:17:03.313 "uuid": "e9457893-9fdd-4fd1-9633-2dd10e992ecd", 00:17:03.313 "is_configured": true, 00:17:03.313 "data_offset": 0, 00:17:03.313 "data_size": 65536 00:17:03.313 }, 00:17:03.313 { 00:17:03.313 "name": "BaseBdev3", 00:17:03.313 "uuid": "916cb0bb-90bc-4021-9b8b-0453db4428f4", 00:17:03.313 "is_configured": true, 00:17:03.313 "data_offset": 0, 00:17:03.313 "data_size": 65536 00:17:03.313 } 00:17:03.313 ] 00:17:03.313 }' 00:17:03.313 22:24:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:03.313 22:24:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:03.882 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:03.882 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:03.882 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.882 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:04.145 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:04.145 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid 
'!=' Existed_Raid ']' 00:17:04.145 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:04.484 [2024-07-12 22:24:14.643903] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:04.484 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:04.484 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:04.484 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.484 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:04.763 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:04.763 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:04.763 22:24:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:05.023 [2024-07-12 22:24:15.135803] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:05.023 [2024-07-12 22:24:15.135889] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:05.023 [2024-07-12 22:24:15.146762] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:05.023 [2024-07-12 22:24:15.146797] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:05.023 [2024-07-12 22:24:15.146809] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xacc400 name Existed_Raid, state offline 00:17:05.023 22:24:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:05.023 22:24:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:05.023 22:24:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.023 22:24:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:05.282 22:24:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:05.282 22:24:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:05.282 22:24:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:05.282 22:24:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:05.282 22:24:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:05.282 22:24:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:05.540 BaseBdev2 00:17:05.540 22:24:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:05.540 22:24:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:05.540 22:24:15 bdev_raid.raid_state_function_test -- 
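A hand-run equivalent of the teardown traced above (not part of the captured trace; assumes the same running target and socket): because raid1 has redundancy, deleting one base bdev left the array online with two of three members, and only removing the remaining members drove it from online to offline, after which the raid bdev was destructed.

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # raid1 tolerates the loss of one member; the array stays online with 2 of 3 bdevs.
    $RPC bdev_malloc_delete BaseBdev1
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
    # Deleting the remaining members takes the array from online to offline and it is torn down.
    $RPC bdev_malloc_delete BaseBdev2
    $RPC bdev_malloc_delete BaseBdev3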
common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:05.540 22:24:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:05.540 22:24:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:05.540 22:24:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:05.540 22:24:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:05.798 22:24:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:05.798 [ 00:17:05.798 { 00:17:05.798 "name": "BaseBdev2", 00:17:05.798 "aliases": [ 00:17:05.798 "ff2656b9-abba-4982-8426-e142febd7056" 00:17:05.798 ], 00:17:05.798 "product_name": "Malloc disk", 00:17:05.798 "block_size": 512, 00:17:05.798 "num_blocks": 65536, 00:17:05.798 "uuid": "ff2656b9-abba-4982-8426-e142febd7056", 00:17:05.798 "assigned_rate_limits": { 00:17:05.798 "rw_ios_per_sec": 0, 00:17:05.798 "rw_mbytes_per_sec": 0, 00:17:05.798 "r_mbytes_per_sec": 0, 00:17:05.798 "w_mbytes_per_sec": 0 00:17:05.798 }, 00:17:05.798 "claimed": false, 00:17:05.798 "zoned": false, 00:17:05.798 "supported_io_types": { 00:17:05.798 "read": true, 00:17:05.798 "write": true, 00:17:05.798 "unmap": true, 00:17:05.798 "flush": true, 00:17:05.798 "reset": true, 00:17:05.798 "nvme_admin": false, 00:17:05.798 "nvme_io": false, 00:17:05.798 "nvme_io_md": false, 00:17:05.798 "write_zeroes": true, 00:17:05.798 "zcopy": true, 00:17:05.798 "get_zone_info": false, 00:17:05.798 "zone_management": false, 00:17:05.798 "zone_append": false, 00:17:05.798 "compare": false, 00:17:05.798 "compare_and_write": false, 00:17:05.798 "abort": true, 00:17:05.798 "seek_hole": false, 00:17:05.798 "seek_data": false, 00:17:05.798 "copy": true, 00:17:05.798 "nvme_iov_md": false 00:17:05.798 }, 00:17:05.798 "memory_domains": [ 00:17:05.798 { 00:17:05.798 "dma_device_id": "system", 00:17:05.798 "dma_device_type": 1 00:17:05.798 }, 00:17:05.798 { 00:17:05.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.798 "dma_device_type": 2 00:17:05.798 } 00:17:05.798 ], 00:17:05.798 "driver_specific": {} 00:17:05.798 } 00:17:05.798 ] 00:17:06.056 22:24:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:06.056 22:24:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:06.056 22:24:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:06.056 22:24:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:06.056 BaseBdev3 00:17:06.314 22:24:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:06.314 22:24:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:06.314 22:24:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:06.314 22:24:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:06.314 22:24:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:06.314 22:24:16 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:06.314 22:24:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:06.314 22:24:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:06.571 [ 00:17:06.571 { 00:17:06.571 "name": "BaseBdev3", 00:17:06.571 "aliases": [ 00:17:06.571 "6d0a36f8-75c6-4858-bb27-1adeaaabe197" 00:17:06.571 ], 00:17:06.571 "product_name": "Malloc disk", 00:17:06.571 "block_size": 512, 00:17:06.571 "num_blocks": 65536, 00:17:06.571 "uuid": "6d0a36f8-75c6-4858-bb27-1adeaaabe197", 00:17:06.571 "assigned_rate_limits": { 00:17:06.571 "rw_ios_per_sec": 0, 00:17:06.571 "rw_mbytes_per_sec": 0, 00:17:06.571 "r_mbytes_per_sec": 0, 00:17:06.571 "w_mbytes_per_sec": 0 00:17:06.571 }, 00:17:06.571 "claimed": false, 00:17:06.571 "zoned": false, 00:17:06.572 "supported_io_types": { 00:17:06.572 "read": true, 00:17:06.572 "write": true, 00:17:06.572 "unmap": true, 00:17:06.572 "flush": true, 00:17:06.572 "reset": true, 00:17:06.572 "nvme_admin": false, 00:17:06.572 "nvme_io": false, 00:17:06.572 "nvme_io_md": false, 00:17:06.572 "write_zeroes": true, 00:17:06.572 "zcopy": true, 00:17:06.572 "get_zone_info": false, 00:17:06.572 "zone_management": false, 00:17:06.572 "zone_append": false, 00:17:06.572 "compare": false, 00:17:06.572 "compare_and_write": false, 00:17:06.572 "abort": true, 00:17:06.572 "seek_hole": false, 00:17:06.572 "seek_data": false, 00:17:06.572 "copy": true, 00:17:06.572 "nvme_iov_md": false 00:17:06.572 }, 00:17:06.572 "memory_domains": [ 00:17:06.572 { 00:17:06.572 "dma_device_id": "system", 00:17:06.572 "dma_device_type": 1 00:17:06.572 }, 00:17:06.572 { 00:17:06.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.572 "dma_device_type": 2 00:17:06.572 } 00:17:06.572 ], 00:17:06.572 "driver_specific": {} 00:17:06.572 } 00:17:06.572 ] 00:17:06.572 22:24:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:06.572 22:24:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:06.572 22:24:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:06.572 22:24:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:06.830 [2024-07-12 22:24:17.092416] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:06.830 [2024-07-12 22:24:17.092459] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:06.830 [2024-07-12 22:24:17.092480] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:06.830 [2024-07-12 22:24:17.093852] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:06.830 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:06.830 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:06.830 22:24:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:06.830 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:06.830 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:06.830 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:06.830 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.830 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:06.830 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:06.830 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:06.830 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.830 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:07.088 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:07.088 "name": "Existed_Raid", 00:17:07.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.088 "strip_size_kb": 0, 00:17:07.088 "state": "configuring", 00:17:07.088 "raid_level": "raid1", 00:17:07.088 "superblock": false, 00:17:07.088 "num_base_bdevs": 3, 00:17:07.088 "num_base_bdevs_discovered": 2, 00:17:07.088 "num_base_bdevs_operational": 3, 00:17:07.088 "base_bdevs_list": [ 00:17:07.088 { 00:17:07.088 "name": "BaseBdev1", 00:17:07.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.088 "is_configured": false, 00:17:07.088 "data_offset": 0, 00:17:07.088 "data_size": 0 00:17:07.088 }, 00:17:07.088 { 00:17:07.088 "name": "BaseBdev2", 00:17:07.088 "uuid": "ff2656b9-abba-4982-8426-e142febd7056", 00:17:07.088 "is_configured": true, 00:17:07.088 "data_offset": 0, 00:17:07.088 "data_size": 65536 00:17:07.088 }, 00:17:07.088 { 00:17:07.088 "name": "BaseBdev3", 00:17:07.088 "uuid": "6d0a36f8-75c6-4858-bb27-1adeaaabe197", 00:17:07.088 "is_configured": true, 00:17:07.088 "data_offset": 0, 00:17:07.088 "data_size": 65536 00:17:07.088 } 00:17:07.088 ] 00:17:07.088 }' 00:17:07.088 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:07.088 22:24:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.653 22:24:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:07.913 [2024-07-12 22:24:18.187310] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:07.913 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:07.913 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:07.913 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:07.913 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:07.913 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:07.913 22:24:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:07.913 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:07.913 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:07.913 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:07.913 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:07.913 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.913 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:08.171 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.171 "name": "Existed_Raid", 00:17:08.171 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.171 "strip_size_kb": 0, 00:17:08.171 "state": "configuring", 00:17:08.171 "raid_level": "raid1", 00:17:08.171 "superblock": false, 00:17:08.171 "num_base_bdevs": 3, 00:17:08.171 "num_base_bdevs_discovered": 1, 00:17:08.171 "num_base_bdevs_operational": 3, 00:17:08.171 "base_bdevs_list": [ 00:17:08.171 { 00:17:08.171 "name": "BaseBdev1", 00:17:08.171 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.171 "is_configured": false, 00:17:08.171 "data_offset": 0, 00:17:08.171 "data_size": 0 00:17:08.171 }, 00:17:08.171 { 00:17:08.171 "name": null, 00:17:08.171 "uuid": "ff2656b9-abba-4982-8426-e142febd7056", 00:17:08.171 "is_configured": false, 00:17:08.171 "data_offset": 0, 00:17:08.171 "data_size": 65536 00:17:08.171 }, 00:17:08.171 { 00:17:08.171 "name": "BaseBdev3", 00:17:08.171 "uuid": "6d0a36f8-75c6-4858-bb27-1adeaaabe197", 00:17:08.171 "is_configured": true, 00:17:08.171 "data_offset": 0, 00:17:08.171 "data_size": 65536 00:17:08.171 } 00:17:08.171 ] 00:17:08.171 }' 00:17:08.171 22:24:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.171 22:24:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:08.740 22:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.740 22:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:08.999 22:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:08.999 22:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:09.258 [2024-07-12 22:24:19.527446] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:09.258 BaseBdev1 00:17:09.258 22:24:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:09.258 22:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:09.258 22:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:09.258 22:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:09.258 22:24:19 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:09.258 22:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:09.258 22:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:09.516 22:24:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:09.775 [ 00:17:09.775 { 00:17:09.775 "name": "BaseBdev1", 00:17:09.775 "aliases": [ 00:17:09.775 "a34ba42c-64a5-4f34-b75e-12aa00897c02" 00:17:09.775 ], 00:17:09.775 "product_name": "Malloc disk", 00:17:09.775 "block_size": 512, 00:17:09.775 "num_blocks": 65536, 00:17:09.775 "uuid": "a34ba42c-64a5-4f34-b75e-12aa00897c02", 00:17:09.775 "assigned_rate_limits": { 00:17:09.775 "rw_ios_per_sec": 0, 00:17:09.775 "rw_mbytes_per_sec": 0, 00:17:09.775 "r_mbytes_per_sec": 0, 00:17:09.775 "w_mbytes_per_sec": 0 00:17:09.775 }, 00:17:09.775 "claimed": true, 00:17:09.775 "claim_type": "exclusive_write", 00:17:09.775 "zoned": false, 00:17:09.775 "supported_io_types": { 00:17:09.775 "read": true, 00:17:09.775 "write": true, 00:17:09.775 "unmap": true, 00:17:09.775 "flush": true, 00:17:09.775 "reset": true, 00:17:09.775 "nvme_admin": false, 00:17:09.775 "nvme_io": false, 00:17:09.775 "nvme_io_md": false, 00:17:09.775 "write_zeroes": true, 00:17:09.775 "zcopy": true, 00:17:09.775 "get_zone_info": false, 00:17:09.775 "zone_management": false, 00:17:09.775 "zone_append": false, 00:17:09.775 "compare": false, 00:17:09.775 "compare_and_write": false, 00:17:09.775 "abort": true, 00:17:09.775 "seek_hole": false, 00:17:09.775 "seek_data": false, 00:17:09.775 "copy": true, 00:17:09.775 "nvme_iov_md": false 00:17:09.775 }, 00:17:09.775 "memory_domains": [ 00:17:09.775 { 00:17:09.775 "dma_device_id": "system", 00:17:09.775 "dma_device_type": 1 00:17:09.775 }, 00:17:09.775 { 00:17:09.775 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.775 "dma_device_type": 2 00:17:09.775 } 00:17:09.775 ], 00:17:09.775 "driver_specific": {} 00:17:09.775 } 00:17:09.775 ] 00:17:09.775 22:24:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:09.775 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:09.775 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:09.775 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:09.775 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:09.775 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:09.775 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:09.775 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.775 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.775 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.776 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.776 22:24:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.776 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:10.345 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:10.345 "name": "Existed_Raid", 00:17:10.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:10.345 "strip_size_kb": 0, 00:17:10.345 "state": "configuring", 00:17:10.345 "raid_level": "raid1", 00:17:10.345 "superblock": false, 00:17:10.345 "num_base_bdevs": 3, 00:17:10.345 "num_base_bdevs_discovered": 2, 00:17:10.345 "num_base_bdevs_operational": 3, 00:17:10.345 "base_bdevs_list": [ 00:17:10.345 { 00:17:10.345 "name": "BaseBdev1", 00:17:10.345 "uuid": "a34ba42c-64a5-4f34-b75e-12aa00897c02", 00:17:10.345 "is_configured": true, 00:17:10.345 "data_offset": 0, 00:17:10.345 "data_size": 65536 00:17:10.345 }, 00:17:10.345 { 00:17:10.345 "name": null, 00:17:10.345 "uuid": "ff2656b9-abba-4982-8426-e142febd7056", 00:17:10.345 "is_configured": false, 00:17:10.345 "data_offset": 0, 00:17:10.345 "data_size": 65536 00:17:10.345 }, 00:17:10.345 { 00:17:10.345 "name": "BaseBdev3", 00:17:10.345 "uuid": "6d0a36f8-75c6-4858-bb27-1adeaaabe197", 00:17:10.345 "is_configured": true, 00:17:10.345 "data_offset": 0, 00:17:10.345 "data_size": 65536 00:17:10.345 } 00:17:10.345 ] 00:17:10.345 }' 00:17:10.345 22:24:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:10.345 22:24:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.913 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.913 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:11.171 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:11.171 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:11.430 [2024-07-12 22:24:21.556870] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:11.430 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:11.430 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:11.430 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:11.430 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:11.430 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:11.430 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:11.430 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.430 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.430 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.430 
22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:11.430 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.430 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:11.689 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.689 "name": "Existed_Raid", 00:17:11.689 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:11.689 "strip_size_kb": 0, 00:17:11.689 "state": "configuring", 00:17:11.689 "raid_level": "raid1", 00:17:11.689 "superblock": false, 00:17:11.689 "num_base_bdevs": 3, 00:17:11.689 "num_base_bdevs_discovered": 1, 00:17:11.689 "num_base_bdevs_operational": 3, 00:17:11.689 "base_bdevs_list": [ 00:17:11.689 { 00:17:11.689 "name": "BaseBdev1", 00:17:11.689 "uuid": "a34ba42c-64a5-4f34-b75e-12aa00897c02", 00:17:11.689 "is_configured": true, 00:17:11.689 "data_offset": 0, 00:17:11.689 "data_size": 65536 00:17:11.689 }, 00:17:11.689 { 00:17:11.689 "name": null, 00:17:11.689 "uuid": "ff2656b9-abba-4982-8426-e142febd7056", 00:17:11.689 "is_configured": false, 00:17:11.689 "data_offset": 0, 00:17:11.689 "data_size": 65536 00:17:11.689 }, 00:17:11.689 { 00:17:11.689 "name": null, 00:17:11.689 "uuid": "6d0a36f8-75c6-4858-bb27-1adeaaabe197", 00:17:11.689 "is_configured": false, 00:17:11.689 "data_offset": 0, 00:17:11.689 "data_size": 65536 00:17:11.689 } 00:17:11.689 ] 00:17:11.689 }' 00:17:11.689 22:24:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.689 22:24:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:12.257 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.257 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:12.257 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:12.257 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:12.517 [2024-07-12 22:24:22.780127] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:12.517 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:12.517 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:12.517 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:12.517 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:12.517 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:12.517 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:12.517 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.517 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:17:12.517 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.517 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.517 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.517 22:24:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:12.775 22:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.775 "name": "Existed_Raid", 00:17:12.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.775 "strip_size_kb": 0, 00:17:12.775 "state": "configuring", 00:17:12.775 "raid_level": "raid1", 00:17:12.775 "superblock": false, 00:17:12.775 "num_base_bdevs": 3, 00:17:12.775 "num_base_bdevs_discovered": 2, 00:17:12.775 "num_base_bdevs_operational": 3, 00:17:12.775 "base_bdevs_list": [ 00:17:12.775 { 00:17:12.775 "name": "BaseBdev1", 00:17:12.775 "uuid": "a34ba42c-64a5-4f34-b75e-12aa00897c02", 00:17:12.775 "is_configured": true, 00:17:12.775 "data_offset": 0, 00:17:12.775 "data_size": 65536 00:17:12.775 }, 00:17:12.775 { 00:17:12.775 "name": null, 00:17:12.775 "uuid": "ff2656b9-abba-4982-8426-e142febd7056", 00:17:12.775 "is_configured": false, 00:17:12.775 "data_offset": 0, 00:17:12.775 "data_size": 65536 00:17:12.775 }, 00:17:12.775 { 00:17:12.775 "name": "BaseBdev3", 00:17:12.775 "uuid": "6d0a36f8-75c6-4858-bb27-1adeaaabe197", 00:17:12.775 "is_configured": true, 00:17:12.775 "data_offset": 0, 00:17:12.775 "data_size": 65536 00:17:12.775 } 00:17:12.775 ] 00:17:12.775 }' 00:17:12.775 22:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.775 22:24:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:13.342 22:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.342 22:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:13.601 22:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:13.601 22:24:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:13.860 [2024-07-12 22:24:24.023434] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:13.860 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:13.860 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:13.860 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:13.860 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:13.860 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:13.860 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:13.860 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:13.860 
22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:13.860 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:13.860 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:13.860 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.860 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:14.138 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:14.138 "name": "Existed_Raid", 00:17:14.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:14.138 "strip_size_kb": 0, 00:17:14.138 "state": "configuring", 00:17:14.138 "raid_level": "raid1", 00:17:14.138 "superblock": false, 00:17:14.138 "num_base_bdevs": 3, 00:17:14.138 "num_base_bdevs_discovered": 1, 00:17:14.138 "num_base_bdevs_operational": 3, 00:17:14.138 "base_bdevs_list": [ 00:17:14.138 { 00:17:14.138 "name": null, 00:17:14.138 "uuid": "a34ba42c-64a5-4f34-b75e-12aa00897c02", 00:17:14.138 "is_configured": false, 00:17:14.138 "data_offset": 0, 00:17:14.138 "data_size": 65536 00:17:14.138 }, 00:17:14.138 { 00:17:14.138 "name": null, 00:17:14.138 "uuid": "ff2656b9-abba-4982-8426-e142febd7056", 00:17:14.138 "is_configured": false, 00:17:14.138 "data_offset": 0, 00:17:14.138 "data_size": 65536 00:17:14.138 }, 00:17:14.138 { 00:17:14.138 "name": "BaseBdev3", 00:17:14.138 "uuid": "6d0a36f8-75c6-4858-bb27-1adeaaabe197", 00:17:14.138 "is_configured": true, 00:17:14.138 "data_offset": 0, 00:17:14.138 "data_size": 65536 00:17:14.138 } 00:17:14.138 ] 00:17:14.138 }' 00:17:14.138 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:14.138 22:24:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:14.705 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.705 22:24:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:14.964 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:14.964 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:15.222 [2024-07-12 22:24:25.367379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:15.222 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:15.222 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:15.222 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:15.222 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:15.222 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:15.222 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:17:15.222 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.222 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.222 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.222 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.222 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.222 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:15.480 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.480 "name": "Existed_Raid", 00:17:15.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.480 "strip_size_kb": 0, 00:17:15.480 "state": "configuring", 00:17:15.480 "raid_level": "raid1", 00:17:15.480 "superblock": false, 00:17:15.481 "num_base_bdevs": 3, 00:17:15.481 "num_base_bdevs_discovered": 2, 00:17:15.481 "num_base_bdevs_operational": 3, 00:17:15.481 "base_bdevs_list": [ 00:17:15.481 { 00:17:15.481 "name": null, 00:17:15.481 "uuid": "a34ba42c-64a5-4f34-b75e-12aa00897c02", 00:17:15.481 "is_configured": false, 00:17:15.481 "data_offset": 0, 00:17:15.481 "data_size": 65536 00:17:15.481 }, 00:17:15.481 { 00:17:15.481 "name": "BaseBdev2", 00:17:15.481 "uuid": "ff2656b9-abba-4982-8426-e142febd7056", 00:17:15.481 "is_configured": true, 00:17:15.481 "data_offset": 0, 00:17:15.481 "data_size": 65536 00:17:15.481 }, 00:17:15.481 { 00:17:15.481 "name": "BaseBdev3", 00:17:15.481 "uuid": "6d0a36f8-75c6-4858-bb27-1adeaaabe197", 00:17:15.481 "is_configured": true, 00:17:15.481 "data_offset": 0, 00:17:15.481 "data_size": 65536 00:17:15.481 } 00:17:15.481 ] 00:17:15.481 }' 00:17:15.481 22:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.481 22:24:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.048 22:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.048 22:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:16.307 22:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:16.307 22:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.307 22:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:16.565 22:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a34ba42c-64a5-4f34-b75e-12aa00897c02 00:17:16.827 [2024-07-12 22:24:26.963086] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:16.827 [2024-07-12 22:24:26.963125] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xacfe40 00:17:16.827 [2024-07-12 22:24:26.963133] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:16.827 [2024-07-12 22:24:26.963319] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xacce60 00:17:16.827 [2024-07-12 22:24:26.963441] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xacfe40 00:17:16.827 [2024-07-12 22:24:26.963451] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xacfe40 00:17:16.827 [2024-07-12 22:24:26.963615] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:16.827 NewBaseBdev 00:17:16.827 22:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:16.827 22:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:16.827 22:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:16.827 22:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:16.827 22:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:16.827 22:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:16.827 22:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:17.085 22:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:17.344 [ 00:17:17.344 { 00:17:17.344 "name": "NewBaseBdev", 00:17:17.345 "aliases": [ 00:17:17.345 "a34ba42c-64a5-4f34-b75e-12aa00897c02" 00:17:17.345 ], 00:17:17.345 "product_name": "Malloc disk", 00:17:17.345 "block_size": 512, 00:17:17.345 "num_blocks": 65536, 00:17:17.345 "uuid": "a34ba42c-64a5-4f34-b75e-12aa00897c02", 00:17:17.345 "assigned_rate_limits": { 00:17:17.345 "rw_ios_per_sec": 0, 00:17:17.345 "rw_mbytes_per_sec": 0, 00:17:17.345 "r_mbytes_per_sec": 0, 00:17:17.345 "w_mbytes_per_sec": 0 00:17:17.345 }, 00:17:17.345 "claimed": true, 00:17:17.345 "claim_type": "exclusive_write", 00:17:17.345 "zoned": false, 00:17:17.345 "supported_io_types": { 00:17:17.345 "read": true, 00:17:17.345 "write": true, 00:17:17.345 "unmap": true, 00:17:17.345 "flush": true, 00:17:17.345 "reset": true, 00:17:17.345 "nvme_admin": false, 00:17:17.345 "nvme_io": false, 00:17:17.345 "nvme_io_md": false, 00:17:17.345 "write_zeroes": true, 00:17:17.345 "zcopy": true, 00:17:17.345 "get_zone_info": false, 00:17:17.345 "zone_management": false, 00:17:17.345 "zone_append": false, 00:17:17.345 "compare": false, 00:17:17.345 "compare_and_write": false, 00:17:17.345 "abort": true, 00:17:17.345 "seek_hole": false, 00:17:17.345 "seek_data": false, 00:17:17.345 "copy": true, 00:17:17.345 "nvme_iov_md": false 00:17:17.345 }, 00:17:17.345 "memory_domains": [ 00:17:17.345 { 00:17:17.345 "dma_device_id": "system", 00:17:17.345 "dma_device_type": 1 00:17:17.345 }, 00:17:17.345 { 00:17:17.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:17.345 "dma_device_type": 2 00:17:17.345 } 00:17:17.345 ], 00:17:17.345 "driver_specific": {} 00:17:17.345 } 00:17:17.345 ] 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.345 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:17.604 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.604 "name": "Existed_Raid", 00:17:17.604 "uuid": "9e356b6e-30f4-46fb-a422-9cdea6180213", 00:17:17.604 "strip_size_kb": 0, 00:17:17.604 "state": "online", 00:17:17.604 "raid_level": "raid1", 00:17:17.604 "superblock": false, 00:17:17.604 "num_base_bdevs": 3, 00:17:17.604 "num_base_bdevs_discovered": 3, 00:17:17.604 "num_base_bdevs_operational": 3, 00:17:17.604 "base_bdevs_list": [ 00:17:17.604 { 00:17:17.604 "name": "NewBaseBdev", 00:17:17.604 "uuid": "a34ba42c-64a5-4f34-b75e-12aa00897c02", 00:17:17.604 "is_configured": true, 00:17:17.604 "data_offset": 0, 00:17:17.604 "data_size": 65536 00:17:17.604 }, 00:17:17.604 { 00:17:17.604 "name": "BaseBdev2", 00:17:17.604 "uuid": "ff2656b9-abba-4982-8426-e142febd7056", 00:17:17.604 "is_configured": true, 00:17:17.604 "data_offset": 0, 00:17:17.604 "data_size": 65536 00:17:17.604 }, 00:17:17.604 { 00:17:17.604 "name": "BaseBdev3", 00:17:17.604 "uuid": "6d0a36f8-75c6-4858-bb27-1adeaaabe197", 00:17:17.604 "is_configured": true, 00:17:17.604 "data_offset": 0, 00:17:17.604 "data_size": 65536 00:17:17.604 } 00:17:17.604 ] 00:17:17.604 }' 00:17:17.604 22:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.604 22:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:18.171 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:18.171 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:18.171 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:18.171 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:18.171 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:18.171 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:18.171 22:24:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:18.171 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:18.430 [2024-07-12 22:24:28.519609] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:18.430 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:18.430 "name": "Existed_Raid", 00:17:18.430 "aliases": [ 00:17:18.430 "9e356b6e-30f4-46fb-a422-9cdea6180213" 00:17:18.430 ], 00:17:18.430 "product_name": "Raid Volume", 00:17:18.430 "block_size": 512, 00:17:18.430 "num_blocks": 65536, 00:17:18.430 "uuid": "9e356b6e-30f4-46fb-a422-9cdea6180213", 00:17:18.430 "assigned_rate_limits": { 00:17:18.430 "rw_ios_per_sec": 0, 00:17:18.430 "rw_mbytes_per_sec": 0, 00:17:18.430 "r_mbytes_per_sec": 0, 00:17:18.430 "w_mbytes_per_sec": 0 00:17:18.430 }, 00:17:18.430 "claimed": false, 00:17:18.430 "zoned": false, 00:17:18.430 "supported_io_types": { 00:17:18.430 "read": true, 00:17:18.430 "write": true, 00:17:18.430 "unmap": false, 00:17:18.430 "flush": false, 00:17:18.430 "reset": true, 00:17:18.430 "nvme_admin": false, 00:17:18.430 "nvme_io": false, 00:17:18.430 "nvme_io_md": false, 00:17:18.430 "write_zeroes": true, 00:17:18.430 "zcopy": false, 00:17:18.430 "get_zone_info": false, 00:17:18.430 "zone_management": false, 00:17:18.430 "zone_append": false, 00:17:18.430 "compare": false, 00:17:18.430 "compare_and_write": false, 00:17:18.430 "abort": false, 00:17:18.430 "seek_hole": false, 00:17:18.430 "seek_data": false, 00:17:18.430 "copy": false, 00:17:18.430 "nvme_iov_md": false 00:17:18.430 }, 00:17:18.430 "memory_domains": [ 00:17:18.430 { 00:17:18.430 "dma_device_id": "system", 00:17:18.430 "dma_device_type": 1 00:17:18.430 }, 00:17:18.430 { 00:17:18.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.430 "dma_device_type": 2 00:17:18.430 }, 00:17:18.430 { 00:17:18.430 "dma_device_id": "system", 00:17:18.430 "dma_device_type": 1 00:17:18.430 }, 00:17:18.430 { 00:17:18.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.430 "dma_device_type": 2 00:17:18.430 }, 00:17:18.430 { 00:17:18.430 "dma_device_id": "system", 00:17:18.430 "dma_device_type": 1 00:17:18.430 }, 00:17:18.430 { 00:17:18.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.430 "dma_device_type": 2 00:17:18.430 } 00:17:18.430 ], 00:17:18.430 "driver_specific": { 00:17:18.430 "raid": { 00:17:18.430 "uuid": "9e356b6e-30f4-46fb-a422-9cdea6180213", 00:17:18.430 "strip_size_kb": 0, 00:17:18.430 "state": "online", 00:17:18.430 "raid_level": "raid1", 00:17:18.430 "superblock": false, 00:17:18.430 "num_base_bdevs": 3, 00:17:18.430 "num_base_bdevs_discovered": 3, 00:17:18.430 "num_base_bdevs_operational": 3, 00:17:18.430 "base_bdevs_list": [ 00:17:18.430 { 00:17:18.430 "name": "NewBaseBdev", 00:17:18.430 "uuid": "a34ba42c-64a5-4f34-b75e-12aa00897c02", 00:17:18.430 "is_configured": true, 00:17:18.430 "data_offset": 0, 00:17:18.430 "data_size": 65536 00:17:18.430 }, 00:17:18.430 { 00:17:18.430 "name": "BaseBdev2", 00:17:18.430 "uuid": "ff2656b9-abba-4982-8426-e142febd7056", 00:17:18.430 "is_configured": true, 00:17:18.430 "data_offset": 0, 00:17:18.430 "data_size": 65536 00:17:18.430 }, 00:17:18.430 { 00:17:18.430 "name": "BaseBdev3", 00:17:18.430 "uuid": "6d0a36f8-75c6-4858-bb27-1adeaaabe197", 00:17:18.430 "is_configured": true, 00:17:18.430 "data_offset": 0, 00:17:18.430 "data_size": 
65536 00:17:18.430 } 00:17:18.430 ] 00:17:18.430 } 00:17:18.430 } 00:17:18.430 }' 00:17:18.430 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:18.430 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:18.430 BaseBdev2 00:17:18.430 BaseBdev3' 00:17:18.430 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:18.430 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:18.430 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:18.689 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:18.689 "name": "NewBaseBdev", 00:17:18.689 "aliases": [ 00:17:18.689 "a34ba42c-64a5-4f34-b75e-12aa00897c02" 00:17:18.689 ], 00:17:18.689 "product_name": "Malloc disk", 00:17:18.689 "block_size": 512, 00:17:18.689 "num_blocks": 65536, 00:17:18.689 "uuid": "a34ba42c-64a5-4f34-b75e-12aa00897c02", 00:17:18.689 "assigned_rate_limits": { 00:17:18.689 "rw_ios_per_sec": 0, 00:17:18.689 "rw_mbytes_per_sec": 0, 00:17:18.689 "r_mbytes_per_sec": 0, 00:17:18.689 "w_mbytes_per_sec": 0 00:17:18.689 }, 00:17:18.689 "claimed": true, 00:17:18.689 "claim_type": "exclusive_write", 00:17:18.689 "zoned": false, 00:17:18.689 "supported_io_types": { 00:17:18.689 "read": true, 00:17:18.689 "write": true, 00:17:18.689 "unmap": true, 00:17:18.689 "flush": true, 00:17:18.689 "reset": true, 00:17:18.689 "nvme_admin": false, 00:17:18.689 "nvme_io": false, 00:17:18.689 "nvme_io_md": false, 00:17:18.689 "write_zeroes": true, 00:17:18.689 "zcopy": true, 00:17:18.689 "get_zone_info": false, 00:17:18.689 "zone_management": false, 00:17:18.689 "zone_append": false, 00:17:18.689 "compare": false, 00:17:18.689 "compare_and_write": false, 00:17:18.689 "abort": true, 00:17:18.689 "seek_hole": false, 00:17:18.689 "seek_data": false, 00:17:18.689 "copy": true, 00:17:18.689 "nvme_iov_md": false 00:17:18.689 }, 00:17:18.689 "memory_domains": [ 00:17:18.689 { 00:17:18.689 "dma_device_id": "system", 00:17:18.689 "dma_device_type": 1 00:17:18.689 }, 00:17:18.689 { 00:17:18.689 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.689 "dma_device_type": 2 00:17:18.689 } 00:17:18.689 ], 00:17:18.689 "driver_specific": {} 00:17:18.689 }' 00:17:18.689 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:18.689 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:18.689 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:18.689 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:18.689 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:18.690 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:18.690 22:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:18.948 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:18.948 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:18.948 22:24:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:18.948 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:18.948 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:18.948 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:18.948 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:18.948 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:19.207 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:19.207 "name": "BaseBdev2", 00:17:19.207 "aliases": [ 00:17:19.207 "ff2656b9-abba-4982-8426-e142febd7056" 00:17:19.207 ], 00:17:19.207 "product_name": "Malloc disk", 00:17:19.208 "block_size": 512, 00:17:19.208 "num_blocks": 65536, 00:17:19.208 "uuid": "ff2656b9-abba-4982-8426-e142febd7056", 00:17:19.208 "assigned_rate_limits": { 00:17:19.208 "rw_ios_per_sec": 0, 00:17:19.208 "rw_mbytes_per_sec": 0, 00:17:19.208 "r_mbytes_per_sec": 0, 00:17:19.208 "w_mbytes_per_sec": 0 00:17:19.208 }, 00:17:19.208 "claimed": true, 00:17:19.208 "claim_type": "exclusive_write", 00:17:19.208 "zoned": false, 00:17:19.208 "supported_io_types": { 00:17:19.208 "read": true, 00:17:19.208 "write": true, 00:17:19.208 "unmap": true, 00:17:19.208 "flush": true, 00:17:19.208 "reset": true, 00:17:19.208 "nvme_admin": false, 00:17:19.208 "nvme_io": false, 00:17:19.208 "nvme_io_md": false, 00:17:19.208 "write_zeroes": true, 00:17:19.208 "zcopy": true, 00:17:19.208 "get_zone_info": false, 00:17:19.208 "zone_management": false, 00:17:19.208 "zone_append": false, 00:17:19.208 "compare": false, 00:17:19.208 "compare_and_write": false, 00:17:19.208 "abort": true, 00:17:19.208 "seek_hole": false, 00:17:19.208 "seek_data": false, 00:17:19.208 "copy": true, 00:17:19.208 "nvme_iov_md": false 00:17:19.208 }, 00:17:19.208 "memory_domains": [ 00:17:19.208 { 00:17:19.208 "dma_device_id": "system", 00:17:19.208 "dma_device_type": 1 00:17:19.208 }, 00:17:19.208 { 00:17:19.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.208 "dma_device_type": 2 00:17:19.208 } 00:17:19.208 ], 00:17:19.208 "driver_specific": {} 00:17:19.208 }' 00:17:19.208 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:19.208 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:19.208 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:19.208 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:19.208 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:19.467 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:19.467 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:19.467 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:19.467 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:19.467 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:19.467 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:19.467 22:24:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:19.467 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:19.467 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:19.467 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:19.726 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:19.726 "name": "BaseBdev3", 00:17:19.726 "aliases": [ 00:17:19.726 "6d0a36f8-75c6-4858-bb27-1adeaaabe197" 00:17:19.726 ], 00:17:19.726 "product_name": "Malloc disk", 00:17:19.726 "block_size": 512, 00:17:19.726 "num_blocks": 65536, 00:17:19.726 "uuid": "6d0a36f8-75c6-4858-bb27-1adeaaabe197", 00:17:19.726 "assigned_rate_limits": { 00:17:19.726 "rw_ios_per_sec": 0, 00:17:19.726 "rw_mbytes_per_sec": 0, 00:17:19.726 "r_mbytes_per_sec": 0, 00:17:19.726 "w_mbytes_per_sec": 0 00:17:19.726 }, 00:17:19.726 "claimed": true, 00:17:19.726 "claim_type": "exclusive_write", 00:17:19.726 "zoned": false, 00:17:19.726 "supported_io_types": { 00:17:19.726 "read": true, 00:17:19.726 "write": true, 00:17:19.726 "unmap": true, 00:17:19.726 "flush": true, 00:17:19.726 "reset": true, 00:17:19.726 "nvme_admin": false, 00:17:19.726 "nvme_io": false, 00:17:19.726 "nvme_io_md": false, 00:17:19.726 "write_zeroes": true, 00:17:19.726 "zcopy": true, 00:17:19.726 "get_zone_info": false, 00:17:19.726 "zone_management": false, 00:17:19.726 "zone_append": false, 00:17:19.726 "compare": false, 00:17:19.726 "compare_and_write": false, 00:17:19.727 "abort": true, 00:17:19.727 "seek_hole": false, 00:17:19.727 "seek_data": false, 00:17:19.727 "copy": true, 00:17:19.727 "nvme_iov_md": false 00:17:19.727 }, 00:17:19.727 "memory_domains": [ 00:17:19.727 { 00:17:19.727 "dma_device_id": "system", 00:17:19.727 "dma_device_type": 1 00:17:19.727 }, 00:17:19.727 { 00:17:19.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.727 "dma_device_type": 2 00:17:19.727 } 00:17:19.727 ], 00:17:19.727 "driver_specific": {} 00:17:19.727 }' 00:17:19.727 22:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:19.727 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:19.986 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:19.986 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:19.986 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:19.986 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:19.986 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:19.986 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:19.986 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:19.986 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:19.986 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:20.245 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:20.245 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:20.245 [2024-07-12 22:24:30.552718] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:20.245 [2024-07-12 22:24:30.552747] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:20.245 [2024-07-12 22:24:30.552810] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:20.245 [2024-07-12 22:24:30.553095] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:20.245 [2024-07-12 22:24:30.553109] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xacfe40 name Existed_Raid, state offline 00:17:20.245 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3467457 00:17:20.245 22:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3467457 ']' 00:17:20.504 22:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3467457 00:17:20.504 22:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:17:20.504 22:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:20.504 22:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3467457 00:17:20.504 22:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:20.504 22:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:20.504 22:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3467457' 00:17:20.504 killing process with pid 3467457 00:17:20.504 22:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3467457 00:17:20.505 [2024-07-12 22:24:30.619799] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:20.505 22:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3467457 00:17:20.505 [2024-07-12 22:24:30.650872] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:20.769 00:17:20.769 real 0m27.984s 00:17:20.769 user 0m51.481s 00:17:20.769 sys 0m4.911s 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:20.769 ************************************ 00:17:20.769 END TEST raid_state_function_test 00:17:20.769 ************************************ 00:17:20.769 22:24:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:20.769 22:24:30 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:17:20.769 22:24:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:20.769 22:24:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:20.769 22:24:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:20.769 ************************************ 00:17:20.769 START TEST raid_state_function_test_sb 00:17:20.769 ************************************ 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3472118 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3472118' 00:17:20.769 Process raid pid: 3472118 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3472118 /var/tmp/spdk-raid.sock 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 3472118 ']' 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:20.769 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:20.769 22:24:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:20.769 [2024-07-12 22:24:31.023304] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:17:20.769 [2024-07-12 22:24:31.023367] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:21.029 [2024-07-12 22:24:31.152111] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:21.029 [2024-07-12 22:24:31.257806] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:21.029 [2024-07-12 22:24:31.323778] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:21.029 [2024-07-12 22:24:31.323813] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:21.967 22:24:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:21.967 22:24:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:17:21.967 22:24:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:21.967 [2024-07-12 22:24:32.183020] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:21.967 [2024-07-12 22:24:32.183063] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:21.967 [2024-07-12 22:24:32.183074] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:21.967 [2024-07-12 22:24:32.183086] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:21.967 [2024-07-12 22:24:32.183095] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:21.967 [2024-07-12 22:24:32.183106] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:21.967 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:21.967 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:21.967 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:21.967 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:21.967 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:17:21.967 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:21.967 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:21.967 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:21.967 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:21.967 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:21.967 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.967 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.226 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:22.227 "name": "Existed_Raid", 00:17:22.227 "uuid": "eb19be6b-cdfa-456f-9a81-388411f32642", 00:17:22.227 "strip_size_kb": 0, 00:17:22.227 "state": "configuring", 00:17:22.227 "raid_level": "raid1", 00:17:22.227 "superblock": true, 00:17:22.227 "num_base_bdevs": 3, 00:17:22.227 "num_base_bdevs_discovered": 0, 00:17:22.227 "num_base_bdevs_operational": 3, 00:17:22.227 "base_bdevs_list": [ 00:17:22.227 { 00:17:22.227 "name": "BaseBdev1", 00:17:22.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.227 "is_configured": false, 00:17:22.227 "data_offset": 0, 00:17:22.227 "data_size": 0 00:17:22.227 }, 00:17:22.227 { 00:17:22.227 "name": "BaseBdev2", 00:17:22.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.227 "is_configured": false, 00:17:22.227 "data_offset": 0, 00:17:22.227 "data_size": 0 00:17:22.227 }, 00:17:22.227 { 00:17:22.227 "name": "BaseBdev3", 00:17:22.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.227 "is_configured": false, 00:17:22.227 "data_offset": 0, 00:17:22.227 "data_size": 0 00:17:22.227 } 00:17:22.227 ] 00:17:22.227 }' 00:17:22.227 22:24:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:22.227 22:24:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:22.794 22:24:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:23.108 [2024-07-12 22:24:33.249706] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:23.108 [2024-07-12 22:24:33.249738] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2650a80 name Existed_Raid, state configuring 00:17:23.108 22:24:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:23.367 [2024-07-12 22:24:33.498380] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:23.367 [2024-07-12 22:24:33.498413] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:23.367 [2024-07-12 22:24:33.498423] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:23.367 [2024-07-12 22:24:33.498435] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:17:23.368 [2024-07-12 22:24:33.498443] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:23.368 [2024-07-12 22:24:33.498454] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:23.368 22:24:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:23.627 [2024-07-12 22:24:33.756931] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:23.627 BaseBdev1 00:17:23.627 22:24:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:23.627 22:24:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:23.627 22:24:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:23.627 22:24:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:23.627 22:24:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:23.627 22:24:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:23.627 22:24:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:23.886 22:24:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:24.146 [ 00:17:24.146 { 00:17:24.146 "name": "BaseBdev1", 00:17:24.146 "aliases": [ 00:17:24.146 "7398813b-e724-40a0-9691-fe11a1d39e77" 00:17:24.146 ], 00:17:24.146 "product_name": "Malloc disk", 00:17:24.146 "block_size": 512, 00:17:24.146 "num_blocks": 65536, 00:17:24.146 "uuid": "7398813b-e724-40a0-9691-fe11a1d39e77", 00:17:24.146 "assigned_rate_limits": { 00:17:24.146 "rw_ios_per_sec": 0, 00:17:24.146 "rw_mbytes_per_sec": 0, 00:17:24.146 "r_mbytes_per_sec": 0, 00:17:24.146 "w_mbytes_per_sec": 0 00:17:24.146 }, 00:17:24.146 "claimed": true, 00:17:24.146 "claim_type": "exclusive_write", 00:17:24.146 "zoned": false, 00:17:24.146 "supported_io_types": { 00:17:24.146 "read": true, 00:17:24.146 "write": true, 00:17:24.146 "unmap": true, 00:17:24.146 "flush": true, 00:17:24.146 "reset": true, 00:17:24.146 "nvme_admin": false, 00:17:24.146 "nvme_io": false, 00:17:24.146 "nvme_io_md": false, 00:17:24.146 "write_zeroes": true, 00:17:24.146 "zcopy": true, 00:17:24.146 "get_zone_info": false, 00:17:24.146 "zone_management": false, 00:17:24.146 "zone_append": false, 00:17:24.146 "compare": false, 00:17:24.146 "compare_and_write": false, 00:17:24.146 "abort": true, 00:17:24.146 "seek_hole": false, 00:17:24.146 "seek_data": false, 00:17:24.146 "copy": true, 00:17:24.146 "nvme_iov_md": false 00:17:24.146 }, 00:17:24.146 "memory_domains": [ 00:17:24.146 { 00:17:24.146 "dma_device_id": "system", 00:17:24.146 "dma_device_type": 1 00:17:24.146 }, 00:17:24.146 { 00:17:24.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.146 "dma_device_type": 2 00:17:24.146 } 00:17:24.146 ], 00:17:24.146 "driver_specific": {} 00:17:24.146 } 00:17:24.146 ] 00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 
00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.146 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:24.405 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:24.405 "name": "Existed_Raid", 00:17:24.406 "uuid": "8f825584-d984-4322-bce5-620ec410574b", 00:17:24.406 "strip_size_kb": 0, 00:17:24.406 "state": "configuring", 00:17:24.406 "raid_level": "raid1", 00:17:24.406 "superblock": true, 00:17:24.406 "num_base_bdevs": 3, 00:17:24.406 "num_base_bdevs_discovered": 1, 00:17:24.406 "num_base_bdevs_operational": 3, 00:17:24.406 "base_bdevs_list": [ 00:17:24.406 { 00:17:24.406 "name": "BaseBdev1", 00:17:24.406 "uuid": "7398813b-e724-40a0-9691-fe11a1d39e77", 00:17:24.406 "is_configured": true, 00:17:24.406 "data_offset": 2048, 00:17:24.406 "data_size": 63488 00:17:24.406 }, 00:17:24.406 { 00:17:24.406 "name": "BaseBdev2", 00:17:24.406 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:24.406 "is_configured": false, 00:17:24.406 "data_offset": 0, 00:17:24.406 "data_size": 0 00:17:24.406 }, 00:17:24.406 { 00:17:24.406 "name": "BaseBdev3", 00:17:24.406 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:24.406 "is_configured": false, 00:17:24.406 "data_offset": 0, 00:17:24.406 "data_size": 0 00:17:24.406 } 00:17:24.406 ] 00:17:24.406 }' 00:17:24.406 22:24:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:24.406 22:24:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:24.974 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:25.233 [2024-07-12 22:24:35.325078] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:25.233 [2024-07-12 22:24:35.325124] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2650310 name Existed_Raid, state configuring 00:17:25.233 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:25.493 [2024-07-12 22:24:35.569765] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:25.493 [2024-07-12 22:24:35.571267] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:25.493 [2024-07-12 22:24:35.571300] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:25.493 [2024-07-12 22:24:35.571311] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:25.493 [2024-07-12 22:24:35.571323] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.493 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:25.753 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:25.753 "name": "Existed_Raid", 00:17:25.753 "uuid": "d2c80a70-ee64-461d-a67f-9ff2248adf59", 00:17:25.753 "strip_size_kb": 0, 00:17:25.753 "state": "configuring", 00:17:25.753 "raid_level": "raid1", 00:17:25.753 "superblock": true, 00:17:25.753 "num_base_bdevs": 3, 00:17:25.753 "num_base_bdevs_discovered": 1, 00:17:25.753 "num_base_bdevs_operational": 3, 00:17:25.753 "base_bdevs_list": [ 00:17:25.753 { 00:17:25.753 "name": "BaseBdev1", 00:17:25.753 "uuid": "7398813b-e724-40a0-9691-fe11a1d39e77", 00:17:25.753 "is_configured": true, 00:17:25.753 "data_offset": 2048, 00:17:25.753 "data_size": 63488 00:17:25.753 }, 00:17:25.753 { 00:17:25.753 "name": "BaseBdev2", 00:17:25.753 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.753 "is_configured": false, 00:17:25.753 "data_offset": 0, 00:17:25.753 "data_size": 0 00:17:25.753 }, 00:17:25.753 { 00:17:25.753 "name": 
"BaseBdev3", 00:17:25.753 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.753 "is_configured": false, 00:17:25.753 "data_offset": 0, 00:17:25.753 "data_size": 0 00:17:25.753 } 00:17:25.753 ] 00:17:25.753 }' 00:17:25.753 22:24:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:25.753 22:24:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:26.322 22:24:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:26.322 [2024-07-12 22:24:36.571960] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:26.322 BaseBdev2 00:17:26.322 22:24:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:26.322 22:24:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:26.322 22:24:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:26.322 22:24:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:26.322 22:24:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:26.322 22:24:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:26.322 22:24:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:26.581 22:24:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:26.840 [ 00:17:26.841 { 00:17:26.841 "name": "BaseBdev2", 00:17:26.841 "aliases": [ 00:17:26.841 "919c3349-c748-4104-b194-d4d1200cd340" 00:17:26.841 ], 00:17:26.841 "product_name": "Malloc disk", 00:17:26.841 "block_size": 512, 00:17:26.841 "num_blocks": 65536, 00:17:26.841 "uuid": "919c3349-c748-4104-b194-d4d1200cd340", 00:17:26.841 "assigned_rate_limits": { 00:17:26.841 "rw_ios_per_sec": 0, 00:17:26.841 "rw_mbytes_per_sec": 0, 00:17:26.841 "r_mbytes_per_sec": 0, 00:17:26.841 "w_mbytes_per_sec": 0 00:17:26.841 }, 00:17:26.841 "claimed": true, 00:17:26.841 "claim_type": "exclusive_write", 00:17:26.841 "zoned": false, 00:17:26.841 "supported_io_types": { 00:17:26.841 "read": true, 00:17:26.841 "write": true, 00:17:26.841 "unmap": true, 00:17:26.841 "flush": true, 00:17:26.841 "reset": true, 00:17:26.841 "nvme_admin": false, 00:17:26.841 "nvme_io": false, 00:17:26.841 "nvme_io_md": false, 00:17:26.841 "write_zeroes": true, 00:17:26.841 "zcopy": true, 00:17:26.841 "get_zone_info": false, 00:17:26.841 "zone_management": false, 00:17:26.841 "zone_append": false, 00:17:26.841 "compare": false, 00:17:26.841 "compare_and_write": false, 00:17:26.841 "abort": true, 00:17:26.841 "seek_hole": false, 00:17:26.841 "seek_data": false, 00:17:26.841 "copy": true, 00:17:26.841 "nvme_iov_md": false 00:17:26.841 }, 00:17:26.841 "memory_domains": [ 00:17:26.841 { 00:17:26.841 "dma_device_id": "system", 00:17:26.841 "dma_device_type": 1 00:17:26.841 }, 00:17:26.841 { 00:17:26.841 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.841 "dma_device_type": 2 00:17:26.841 } 00:17:26.841 ], 00:17:26.841 "driver_specific": {} 
00:17:26.841 } 00:17:26.841 ] 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:26.841 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.101 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.101 "name": "Existed_Raid", 00:17:27.101 "uuid": "d2c80a70-ee64-461d-a67f-9ff2248adf59", 00:17:27.101 "strip_size_kb": 0, 00:17:27.101 "state": "configuring", 00:17:27.101 "raid_level": "raid1", 00:17:27.101 "superblock": true, 00:17:27.101 "num_base_bdevs": 3, 00:17:27.101 "num_base_bdevs_discovered": 2, 00:17:27.101 "num_base_bdevs_operational": 3, 00:17:27.101 "base_bdevs_list": [ 00:17:27.101 { 00:17:27.101 "name": "BaseBdev1", 00:17:27.101 "uuid": "7398813b-e724-40a0-9691-fe11a1d39e77", 00:17:27.101 "is_configured": true, 00:17:27.101 "data_offset": 2048, 00:17:27.101 "data_size": 63488 00:17:27.101 }, 00:17:27.101 { 00:17:27.101 "name": "BaseBdev2", 00:17:27.101 "uuid": "919c3349-c748-4104-b194-d4d1200cd340", 00:17:27.101 "is_configured": true, 00:17:27.101 "data_offset": 2048, 00:17:27.101 "data_size": 63488 00:17:27.101 }, 00:17:27.101 { 00:17:27.101 "name": "BaseBdev3", 00:17:27.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.101 "is_configured": false, 00:17:27.101 "data_offset": 0, 00:17:27.101 "data_size": 0 00:17:27.101 } 00:17:27.101 ] 00:17:27.101 }' 00:17:27.101 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.101 22:24:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:27.669 22:24:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:27.926 [2024-07-12 
22:24:38.083422] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:27.926 [2024-07-12 22:24:38.083583] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2651400 00:17:27.926 [2024-07-12 22:24:38.083597] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:27.926 [2024-07-12 22:24:38.083767] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2650ef0 00:17:27.926 [2024-07-12 22:24:38.083888] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2651400 00:17:27.926 [2024-07-12 22:24:38.083898] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2651400 00:17:27.926 [2024-07-12 22:24:38.084006] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:27.926 BaseBdev3 00:17:27.926 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:27.926 22:24:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:27.926 22:24:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:27.926 22:24:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:27.926 22:24:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:27.926 22:24:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:27.926 22:24:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:28.184 22:24:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:28.443 [ 00:17:28.443 { 00:17:28.443 "name": "BaseBdev3", 00:17:28.443 "aliases": [ 00:17:28.443 "94298d79-d6ec-46ba-8a42-0b8d703abafe" 00:17:28.443 ], 00:17:28.443 "product_name": "Malloc disk", 00:17:28.443 "block_size": 512, 00:17:28.443 "num_blocks": 65536, 00:17:28.443 "uuid": "94298d79-d6ec-46ba-8a42-0b8d703abafe", 00:17:28.443 "assigned_rate_limits": { 00:17:28.443 "rw_ios_per_sec": 0, 00:17:28.443 "rw_mbytes_per_sec": 0, 00:17:28.443 "r_mbytes_per_sec": 0, 00:17:28.443 "w_mbytes_per_sec": 0 00:17:28.443 }, 00:17:28.443 "claimed": true, 00:17:28.443 "claim_type": "exclusive_write", 00:17:28.443 "zoned": false, 00:17:28.443 "supported_io_types": { 00:17:28.443 "read": true, 00:17:28.443 "write": true, 00:17:28.443 "unmap": true, 00:17:28.443 "flush": true, 00:17:28.443 "reset": true, 00:17:28.443 "nvme_admin": false, 00:17:28.443 "nvme_io": false, 00:17:28.443 "nvme_io_md": false, 00:17:28.443 "write_zeroes": true, 00:17:28.443 "zcopy": true, 00:17:28.443 "get_zone_info": false, 00:17:28.443 "zone_management": false, 00:17:28.443 "zone_append": false, 00:17:28.443 "compare": false, 00:17:28.443 "compare_and_write": false, 00:17:28.443 "abort": true, 00:17:28.443 "seek_hole": false, 00:17:28.443 "seek_data": false, 00:17:28.443 "copy": true, 00:17:28.443 "nvme_iov_md": false 00:17:28.443 }, 00:17:28.443 "memory_domains": [ 00:17:28.443 { 00:17:28.443 "dma_device_id": "system", 00:17:28.443 "dma_device_type": 1 00:17:28.443 }, 00:17:28.443 { 00:17:28.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:28.443 
"dma_device_type": 2 00:17:28.443 } 00:17:28.443 ], 00:17:28.443 "driver_specific": {} 00:17:28.443 } 00:17:28.443 ] 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.443 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:28.702 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.703 "name": "Existed_Raid", 00:17:28.703 "uuid": "d2c80a70-ee64-461d-a67f-9ff2248adf59", 00:17:28.703 "strip_size_kb": 0, 00:17:28.703 "state": "online", 00:17:28.703 "raid_level": "raid1", 00:17:28.703 "superblock": true, 00:17:28.703 "num_base_bdevs": 3, 00:17:28.703 "num_base_bdevs_discovered": 3, 00:17:28.703 "num_base_bdevs_operational": 3, 00:17:28.703 "base_bdevs_list": [ 00:17:28.703 { 00:17:28.703 "name": "BaseBdev1", 00:17:28.703 "uuid": "7398813b-e724-40a0-9691-fe11a1d39e77", 00:17:28.703 "is_configured": true, 00:17:28.703 "data_offset": 2048, 00:17:28.703 "data_size": 63488 00:17:28.703 }, 00:17:28.703 { 00:17:28.703 "name": "BaseBdev2", 00:17:28.703 "uuid": "919c3349-c748-4104-b194-d4d1200cd340", 00:17:28.703 "is_configured": true, 00:17:28.703 "data_offset": 2048, 00:17:28.703 "data_size": 63488 00:17:28.703 }, 00:17:28.703 { 00:17:28.703 "name": "BaseBdev3", 00:17:28.703 "uuid": "94298d79-d6ec-46ba-8a42-0b8d703abafe", 00:17:28.703 "is_configured": true, 00:17:28.703 "data_offset": 2048, 00:17:28.703 "data_size": 63488 00:17:28.703 } 00:17:28.703 ] 00:17:28.703 }' 00:17:28.703 22:24:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.703 22:24:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:29.270 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:29.270 22:24:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:29.270 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:29.270 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:29.270 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:29.270 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:29.270 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:29.270 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:29.529 [2024-07-12 22:24:39.659910] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:29.529 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:29.529 "name": "Existed_Raid", 00:17:29.529 "aliases": [ 00:17:29.529 "d2c80a70-ee64-461d-a67f-9ff2248adf59" 00:17:29.529 ], 00:17:29.529 "product_name": "Raid Volume", 00:17:29.529 "block_size": 512, 00:17:29.529 "num_blocks": 63488, 00:17:29.529 "uuid": "d2c80a70-ee64-461d-a67f-9ff2248adf59", 00:17:29.529 "assigned_rate_limits": { 00:17:29.529 "rw_ios_per_sec": 0, 00:17:29.529 "rw_mbytes_per_sec": 0, 00:17:29.529 "r_mbytes_per_sec": 0, 00:17:29.529 "w_mbytes_per_sec": 0 00:17:29.529 }, 00:17:29.529 "claimed": false, 00:17:29.529 "zoned": false, 00:17:29.529 "supported_io_types": { 00:17:29.529 "read": true, 00:17:29.529 "write": true, 00:17:29.529 "unmap": false, 00:17:29.529 "flush": false, 00:17:29.529 "reset": true, 00:17:29.529 "nvme_admin": false, 00:17:29.529 "nvme_io": false, 00:17:29.529 "nvme_io_md": false, 00:17:29.529 "write_zeroes": true, 00:17:29.529 "zcopy": false, 00:17:29.529 "get_zone_info": false, 00:17:29.529 "zone_management": false, 00:17:29.529 "zone_append": false, 00:17:29.529 "compare": false, 00:17:29.529 "compare_and_write": false, 00:17:29.529 "abort": false, 00:17:29.529 "seek_hole": false, 00:17:29.529 "seek_data": false, 00:17:29.529 "copy": false, 00:17:29.529 "nvme_iov_md": false 00:17:29.529 }, 00:17:29.529 "memory_domains": [ 00:17:29.529 { 00:17:29.529 "dma_device_id": "system", 00:17:29.529 "dma_device_type": 1 00:17:29.529 }, 00:17:29.529 { 00:17:29.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.529 "dma_device_type": 2 00:17:29.529 }, 00:17:29.529 { 00:17:29.529 "dma_device_id": "system", 00:17:29.529 "dma_device_type": 1 00:17:29.529 }, 00:17:29.530 { 00:17:29.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.530 "dma_device_type": 2 00:17:29.530 }, 00:17:29.530 { 00:17:29.530 "dma_device_id": "system", 00:17:29.530 "dma_device_type": 1 00:17:29.530 }, 00:17:29.530 { 00:17:29.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.530 "dma_device_type": 2 00:17:29.530 } 00:17:29.530 ], 00:17:29.530 "driver_specific": { 00:17:29.530 "raid": { 00:17:29.530 "uuid": "d2c80a70-ee64-461d-a67f-9ff2248adf59", 00:17:29.530 "strip_size_kb": 0, 00:17:29.530 "state": "online", 00:17:29.530 "raid_level": "raid1", 00:17:29.530 "superblock": true, 00:17:29.530 "num_base_bdevs": 3, 00:17:29.530 "num_base_bdevs_discovered": 3, 00:17:29.530 "num_base_bdevs_operational": 3, 00:17:29.530 "base_bdevs_list": [ 00:17:29.530 { 00:17:29.530 "name": "BaseBdev1", 00:17:29.530 "uuid": 
"7398813b-e724-40a0-9691-fe11a1d39e77", 00:17:29.530 "is_configured": true, 00:17:29.530 "data_offset": 2048, 00:17:29.530 "data_size": 63488 00:17:29.530 }, 00:17:29.530 { 00:17:29.530 "name": "BaseBdev2", 00:17:29.530 "uuid": "919c3349-c748-4104-b194-d4d1200cd340", 00:17:29.530 "is_configured": true, 00:17:29.530 "data_offset": 2048, 00:17:29.530 "data_size": 63488 00:17:29.530 }, 00:17:29.530 { 00:17:29.530 "name": "BaseBdev3", 00:17:29.530 "uuid": "94298d79-d6ec-46ba-8a42-0b8d703abafe", 00:17:29.530 "is_configured": true, 00:17:29.530 "data_offset": 2048, 00:17:29.530 "data_size": 63488 00:17:29.530 } 00:17:29.530 ] 00:17:29.530 } 00:17:29.530 } 00:17:29.530 }' 00:17:29.530 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:29.530 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:29.530 BaseBdev2 00:17:29.530 BaseBdev3' 00:17:29.530 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:29.530 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:29.530 22:24:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:30.097 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:30.097 "name": "BaseBdev1", 00:17:30.097 "aliases": [ 00:17:30.097 "7398813b-e724-40a0-9691-fe11a1d39e77" 00:17:30.097 ], 00:17:30.097 "product_name": "Malloc disk", 00:17:30.097 "block_size": 512, 00:17:30.097 "num_blocks": 65536, 00:17:30.097 "uuid": "7398813b-e724-40a0-9691-fe11a1d39e77", 00:17:30.097 "assigned_rate_limits": { 00:17:30.097 "rw_ios_per_sec": 0, 00:17:30.097 "rw_mbytes_per_sec": 0, 00:17:30.097 "r_mbytes_per_sec": 0, 00:17:30.097 "w_mbytes_per_sec": 0 00:17:30.097 }, 00:17:30.097 "claimed": true, 00:17:30.097 "claim_type": "exclusive_write", 00:17:30.097 "zoned": false, 00:17:30.097 "supported_io_types": { 00:17:30.097 "read": true, 00:17:30.097 "write": true, 00:17:30.097 "unmap": true, 00:17:30.097 "flush": true, 00:17:30.097 "reset": true, 00:17:30.097 "nvme_admin": false, 00:17:30.097 "nvme_io": false, 00:17:30.097 "nvme_io_md": false, 00:17:30.097 "write_zeroes": true, 00:17:30.097 "zcopy": true, 00:17:30.097 "get_zone_info": false, 00:17:30.097 "zone_management": false, 00:17:30.097 "zone_append": false, 00:17:30.097 "compare": false, 00:17:30.097 "compare_and_write": false, 00:17:30.097 "abort": true, 00:17:30.097 "seek_hole": false, 00:17:30.097 "seek_data": false, 00:17:30.097 "copy": true, 00:17:30.097 "nvme_iov_md": false 00:17:30.097 }, 00:17:30.097 "memory_domains": [ 00:17:30.097 { 00:17:30.097 "dma_device_id": "system", 00:17:30.097 "dma_device_type": 1 00:17:30.097 }, 00:17:30.097 { 00:17:30.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.097 "dma_device_type": 2 00:17:30.097 } 00:17:30.097 ], 00:17:30.097 "driver_specific": {} 00:17:30.097 }' 00:17:30.097 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.097 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.097 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:30.097 22:24:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.097 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.097 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:30.097 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.356 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.357 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:30.357 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.357 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.357 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:30.357 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:30.357 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:30.357 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:30.616 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:30.616 "name": "BaseBdev2", 00:17:30.616 "aliases": [ 00:17:30.616 "919c3349-c748-4104-b194-d4d1200cd340" 00:17:30.616 ], 00:17:30.616 "product_name": "Malloc disk", 00:17:30.616 "block_size": 512, 00:17:30.616 "num_blocks": 65536, 00:17:30.616 "uuid": "919c3349-c748-4104-b194-d4d1200cd340", 00:17:30.616 "assigned_rate_limits": { 00:17:30.616 "rw_ios_per_sec": 0, 00:17:30.616 "rw_mbytes_per_sec": 0, 00:17:30.616 "r_mbytes_per_sec": 0, 00:17:30.616 "w_mbytes_per_sec": 0 00:17:30.616 }, 00:17:30.616 "claimed": true, 00:17:30.616 "claim_type": "exclusive_write", 00:17:30.616 "zoned": false, 00:17:30.616 "supported_io_types": { 00:17:30.616 "read": true, 00:17:30.616 "write": true, 00:17:30.616 "unmap": true, 00:17:30.616 "flush": true, 00:17:30.616 "reset": true, 00:17:30.616 "nvme_admin": false, 00:17:30.616 "nvme_io": false, 00:17:30.616 "nvme_io_md": false, 00:17:30.616 "write_zeroes": true, 00:17:30.616 "zcopy": true, 00:17:30.616 "get_zone_info": false, 00:17:30.616 "zone_management": false, 00:17:30.616 "zone_append": false, 00:17:30.616 "compare": false, 00:17:30.616 "compare_and_write": false, 00:17:30.616 "abort": true, 00:17:30.616 "seek_hole": false, 00:17:30.616 "seek_data": false, 00:17:30.616 "copy": true, 00:17:30.616 "nvme_iov_md": false 00:17:30.616 }, 00:17:30.616 "memory_domains": [ 00:17:30.616 { 00:17:30.616 "dma_device_id": "system", 00:17:30.616 "dma_device_type": 1 00:17:30.616 }, 00:17:30.616 { 00:17:30.616 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.616 "dma_device_type": 2 00:17:30.616 } 00:17:30.616 ], 00:17:30.616 "driver_specific": {} 00:17:30.616 }' 00:17:30.616 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.616 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.616 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:30.616 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.616 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:17:30.616 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:30.616 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.875 22:24:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.875 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:30.875 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.875 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.875 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:30.875 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:30.875 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:30.875 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:31.134 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:31.134 "name": "BaseBdev3", 00:17:31.134 "aliases": [ 00:17:31.134 "94298d79-d6ec-46ba-8a42-0b8d703abafe" 00:17:31.134 ], 00:17:31.134 "product_name": "Malloc disk", 00:17:31.134 "block_size": 512, 00:17:31.134 "num_blocks": 65536, 00:17:31.134 "uuid": "94298d79-d6ec-46ba-8a42-0b8d703abafe", 00:17:31.134 "assigned_rate_limits": { 00:17:31.134 "rw_ios_per_sec": 0, 00:17:31.134 "rw_mbytes_per_sec": 0, 00:17:31.134 "r_mbytes_per_sec": 0, 00:17:31.134 "w_mbytes_per_sec": 0 00:17:31.134 }, 00:17:31.134 "claimed": true, 00:17:31.134 "claim_type": "exclusive_write", 00:17:31.134 "zoned": false, 00:17:31.134 "supported_io_types": { 00:17:31.134 "read": true, 00:17:31.134 "write": true, 00:17:31.134 "unmap": true, 00:17:31.134 "flush": true, 00:17:31.135 "reset": true, 00:17:31.135 "nvme_admin": false, 00:17:31.135 "nvme_io": false, 00:17:31.135 "nvme_io_md": false, 00:17:31.135 "write_zeroes": true, 00:17:31.135 "zcopy": true, 00:17:31.135 "get_zone_info": false, 00:17:31.135 "zone_management": false, 00:17:31.135 "zone_append": false, 00:17:31.135 "compare": false, 00:17:31.135 "compare_and_write": false, 00:17:31.135 "abort": true, 00:17:31.135 "seek_hole": false, 00:17:31.135 "seek_data": false, 00:17:31.135 "copy": true, 00:17:31.135 "nvme_iov_md": false 00:17:31.135 }, 00:17:31.135 "memory_domains": [ 00:17:31.135 { 00:17:31.135 "dma_device_id": "system", 00:17:31.135 "dma_device_type": 1 00:17:31.135 }, 00:17:31.135 { 00:17:31.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.135 "dma_device_type": 2 00:17:31.135 } 00:17:31.135 ], 00:17:31.135 "driver_specific": {} 00:17:31.135 }' 00:17:31.135 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.135 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.135 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:31.135 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.135 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.394 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:31.394 
22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.394 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.394 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:31.394 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.394 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.394 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:31.394 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:31.654 [2024-07-12 22:24:41.849491] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.654 22:24:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:31.913 22:24:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:31.913 "name": "Existed_Raid", 00:17:31.913 "uuid": "d2c80a70-ee64-461d-a67f-9ff2248adf59", 00:17:31.913 "strip_size_kb": 0, 00:17:31.913 "state": "online", 00:17:31.913 "raid_level": "raid1", 00:17:31.913 "superblock": true, 00:17:31.913 "num_base_bdevs": 3, 00:17:31.913 "num_base_bdevs_discovered": 2, 00:17:31.913 "num_base_bdevs_operational": 2, 00:17:31.913 "base_bdevs_list": [ 00:17:31.913 { 00:17:31.913 "name": null, 00:17:31.913 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:17:31.913 "is_configured": false, 00:17:31.913 "data_offset": 2048, 00:17:31.913 "data_size": 63488 00:17:31.913 }, 00:17:31.913 { 00:17:31.913 "name": "BaseBdev2", 00:17:31.913 "uuid": "919c3349-c748-4104-b194-d4d1200cd340", 00:17:31.913 "is_configured": true, 00:17:31.913 "data_offset": 2048, 00:17:31.913 "data_size": 63488 00:17:31.913 }, 00:17:31.913 { 00:17:31.913 "name": "BaseBdev3", 00:17:31.913 "uuid": "94298d79-d6ec-46ba-8a42-0b8d703abafe", 00:17:31.913 "is_configured": true, 00:17:31.913 "data_offset": 2048, 00:17:31.913 "data_size": 63488 00:17:31.913 } 00:17:31.913 ] 00:17:31.913 }' 00:17:31.913 22:24:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:31.913 22:24:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:32.851 22:24:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:32.851 22:24:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:32.851 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:32.851 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.111 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:33.111 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:33.111 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:33.111 [2024-07-12 22:24:43.402702] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:33.370 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:33.370 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:33.370 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.370 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:33.629 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:33.629 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:33.629 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:33.629 [2024-07-12 22:24:43.922692] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:33.629 [2024-07-12 22:24:43.922783] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:33.629 [2024-07-12 22:24:43.935509] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:33.629 [2024-07-12 22:24:43.935546] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:33.629 [2024-07-12 22:24:43.935558] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2651400 
name Existed_Raid, state offline 00:17:33.889 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:33.889 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:33.889 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.889 22:24:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:33.889 22:24:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:33.889 22:24:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:33.889 22:24:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:33.889 22:24:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:33.889 22:24:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:33.889 22:24:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:34.148 BaseBdev2 00:17:34.148 22:24:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:34.148 22:24:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:34.148 22:24:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:34.148 22:24:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:34.148 22:24:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:34.148 22:24:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:34.148 22:24:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:34.407 22:24:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:34.666 [ 00:17:34.666 { 00:17:34.666 "name": "BaseBdev2", 00:17:34.666 "aliases": [ 00:17:34.666 "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441" 00:17:34.666 ], 00:17:34.666 "product_name": "Malloc disk", 00:17:34.666 "block_size": 512, 00:17:34.666 "num_blocks": 65536, 00:17:34.666 "uuid": "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441", 00:17:34.666 "assigned_rate_limits": { 00:17:34.666 "rw_ios_per_sec": 0, 00:17:34.666 "rw_mbytes_per_sec": 0, 00:17:34.666 "r_mbytes_per_sec": 0, 00:17:34.666 "w_mbytes_per_sec": 0 00:17:34.666 }, 00:17:34.666 "claimed": false, 00:17:34.666 "zoned": false, 00:17:34.666 "supported_io_types": { 00:17:34.666 "read": true, 00:17:34.666 "write": true, 00:17:34.666 "unmap": true, 00:17:34.666 "flush": true, 00:17:34.666 "reset": true, 00:17:34.666 "nvme_admin": false, 00:17:34.666 "nvme_io": false, 00:17:34.666 "nvme_io_md": false, 00:17:34.666 "write_zeroes": true, 00:17:34.666 "zcopy": true, 00:17:34.666 "get_zone_info": false, 00:17:34.666 "zone_management": false, 00:17:34.666 "zone_append": false, 00:17:34.666 "compare": false, 00:17:34.666 
"compare_and_write": false, 00:17:34.666 "abort": true, 00:17:34.666 "seek_hole": false, 00:17:34.666 "seek_data": false, 00:17:34.666 "copy": true, 00:17:34.666 "nvme_iov_md": false 00:17:34.666 }, 00:17:34.666 "memory_domains": [ 00:17:34.666 { 00:17:34.666 "dma_device_id": "system", 00:17:34.666 "dma_device_type": 1 00:17:34.666 }, 00:17:34.666 { 00:17:34.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:34.666 "dma_device_type": 2 00:17:34.666 } 00:17:34.666 ], 00:17:34.666 "driver_specific": {} 00:17:34.666 } 00:17:34.666 ] 00:17:34.666 22:24:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:34.666 22:24:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:34.666 22:24:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:34.666 22:24:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:34.925 BaseBdev3 00:17:34.925 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:34.925 22:24:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:34.925 22:24:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:34.925 22:24:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:34.925 22:24:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:34.925 22:24:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:34.925 22:24:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:35.185 22:24:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:35.444 [ 00:17:35.444 { 00:17:35.444 "name": "BaseBdev3", 00:17:35.444 "aliases": [ 00:17:35.444 "6b3834f6-5ec9-466e-80b3-0b3939b0425c" 00:17:35.444 ], 00:17:35.444 "product_name": "Malloc disk", 00:17:35.444 "block_size": 512, 00:17:35.444 "num_blocks": 65536, 00:17:35.445 "uuid": "6b3834f6-5ec9-466e-80b3-0b3939b0425c", 00:17:35.445 "assigned_rate_limits": { 00:17:35.445 "rw_ios_per_sec": 0, 00:17:35.445 "rw_mbytes_per_sec": 0, 00:17:35.445 "r_mbytes_per_sec": 0, 00:17:35.445 "w_mbytes_per_sec": 0 00:17:35.445 }, 00:17:35.445 "claimed": false, 00:17:35.445 "zoned": false, 00:17:35.445 "supported_io_types": { 00:17:35.445 "read": true, 00:17:35.445 "write": true, 00:17:35.445 "unmap": true, 00:17:35.445 "flush": true, 00:17:35.445 "reset": true, 00:17:35.445 "nvme_admin": false, 00:17:35.445 "nvme_io": false, 00:17:35.445 "nvme_io_md": false, 00:17:35.445 "write_zeroes": true, 00:17:35.445 "zcopy": true, 00:17:35.445 "get_zone_info": false, 00:17:35.445 "zone_management": false, 00:17:35.445 "zone_append": false, 00:17:35.445 "compare": false, 00:17:35.445 "compare_and_write": false, 00:17:35.445 "abort": true, 00:17:35.445 "seek_hole": false, 00:17:35.445 "seek_data": false, 00:17:35.445 "copy": true, 00:17:35.445 "nvme_iov_md": false 00:17:35.445 }, 00:17:35.445 "memory_domains": [ 00:17:35.445 { 
00:17:35.445 "dma_device_id": "system", 00:17:35.445 "dma_device_type": 1 00:17:35.445 }, 00:17:35.445 { 00:17:35.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.445 "dma_device_type": 2 00:17:35.445 } 00:17:35.445 ], 00:17:35.445 "driver_specific": {} 00:17:35.445 } 00:17:35.445 ] 00:17:35.445 22:24:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:35.445 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:35.445 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:35.445 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:35.704 [2024-07-12 22:24:45.885225] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:35.704 [2024-07-12 22:24:45.885265] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:35.704 [2024-07-12 22:24:45.885285] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:35.704 [2024-07-12 22:24:45.886654] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:35.704 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:35.704 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:35.704 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:35.704 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:35.704 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:35.704 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:35.704 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:35.704 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:35.704 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:35.704 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:35.704 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.704 22:24:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:35.964 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.964 "name": "Existed_Raid", 00:17:35.964 "uuid": "839f2797-b980-4283-a34b-275c3cd43f10", 00:17:35.964 "strip_size_kb": 0, 00:17:35.964 "state": "configuring", 00:17:35.964 "raid_level": "raid1", 00:17:35.964 "superblock": true, 00:17:35.964 "num_base_bdevs": 3, 00:17:35.964 "num_base_bdevs_discovered": 2, 00:17:35.964 "num_base_bdevs_operational": 3, 00:17:35.964 "base_bdevs_list": [ 00:17:35.964 { 00:17:35.964 "name": "BaseBdev1", 00:17:35.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.964 "is_configured": 
false, 00:17:35.964 "data_offset": 0, 00:17:35.964 "data_size": 0 00:17:35.964 }, 00:17:35.964 { 00:17:35.964 "name": "BaseBdev2", 00:17:35.964 "uuid": "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441", 00:17:35.964 "is_configured": true, 00:17:35.964 "data_offset": 2048, 00:17:35.964 "data_size": 63488 00:17:35.964 }, 00:17:35.964 { 00:17:35.964 "name": "BaseBdev3", 00:17:35.964 "uuid": "6b3834f6-5ec9-466e-80b3-0b3939b0425c", 00:17:35.964 "is_configured": true, 00:17:35.964 "data_offset": 2048, 00:17:35.964 "data_size": 63488 00:17:35.964 } 00:17:35.964 ] 00:17:35.964 }' 00:17:35.964 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.964 22:24:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:36.533 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:36.792 [2024-07-12 22:24:46.903888] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:36.792 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:36.792 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.792 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.792 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:36.792 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:36.792 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:36.792 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.792 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.792 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.792 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.792 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.792 22:24:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.051 22:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.051 "name": "Existed_Raid", 00:17:37.051 "uuid": "839f2797-b980-4283-a34b-275c3cd43f10", 00:17:37.051 "strip_size_kb": 0, 00:17:37.051 "state": "configuring", 00:17:37.051 "raid_level": "raid1", 00:17:37.051 "superblock": true, 00:17:37.051 "num_base_bdevs": 3, 00:17:37.051 "num_base_bdevs_discovered": 1, 00:17:37.051 "num_base_bdevs_operational": 3, 00:17:37.051 "base_bdevs_list": [ 00:17:37.051 { 00:17:37.051 "name": "BaseBdev1", 00:17:37.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.051 "is_configured": false, 00:17:37.051 "data_offset": 0, 00:17:37.051 "data_size": 0 00:17:37.051 }, 00:17:37.051 { 00:17:37.051 "name": null, 00:17:37.051 "uuid": "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441", 00:17:37.051 "is_configured": false, 00:17:37.051 "data_offset": 2048, 00:17:37.051 "data_size": 
63488 00:17:37.051 }, 00:17:37.051 { 00:17:37.051 "name": "BaseBdev3", 00:17:37.051 "uuid": "6b3834f6-5ec9-466e-80b3-0b3939b0425c", 00:17:37.051 "is_configured": true, 00:17:37.051 "data_offset": 2048, 00:17:37.051 "data_size": 63488 00:17:37.051 } 00:17:37.051 ] 00:17:37.051 }' 00:17:37.051 22:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.051 22:24:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:37.620 22:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.620 22:24:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:37.879 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:37.879 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:38.139 [2024-07-12 22:24:48.252175] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:38.139 BaseBdev1 00:17:38.139 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:38.139 22:24:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:38.139 22:24:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:38.139 22:24:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:38.139 22:24:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:38.139 22:24:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:38.139 22:24:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:38.398 22:24:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:38.657 [ 00:17:38.657 { 00:17:38.657 "name": "BaseBdev1", 00:17:38.657 "aliases": [ 00:17:38.657 "9a6b93e3-cc0a-458f-b921-64515c1e864b" 00:17:38.657 ], 00:17:38.657 "product_name": "Malloc disk", 00:17:38.657 "block_size": 512, 00:17:38.657 "num_blocks": 65536, 00:17:38.657 "uuid": "9a6b93e3-cc0a-458f-b921-64515c1e864b", 00:17:38.657 "assigned_rate_limits": { 00:17:38.657 "rw_ios_per_sec": 0, 00:17:38.657 "rw_mbytes_per_sec": 0, 00:17:38.657 "r_mbytes_per_sec": 0, 00:17:38.657 "w_mbytes_per_sec": 0 00:17:38.657 }, 00:17:38.657 "claimed": true, 00:17:38.657 "claim_type": "exclusive_write", 00:17:38.657 "zoned": false, 00:17:38.657 "supported_io_types": { 00:17:38.657 "read": true, 00:17:38.657 "write": true, 00:17:38.657 "unmap": true, 00:17:38.657 "flush": true, 00:17:38.657 "reset": true, 00:17:38.657 "nvme_admin": false, 00:17:38.657 "nvme_io": false, 00:17:38.657 "nvme_io_md": false, 00:17:38.657 "write_zeroes": true, 00:17:38.657 "zcopy": true, 00:17:38.657 "get_zone_info": false, 00:17:38.657 "zone_management": false, 00:17:38.657 "zone_append": false, 00:17:38.657 "compare": false, 00:17:38.657 
"compare_and_write": false, 00:17:38.657 "abort": true, 00:17:38.657 "seek_hole": false, 00:17:38.657 "seek_data": false, 00:17:38.657 "copy": true, 00:17:38.657 "nvme_iov_md": false 00:17:38.657 }, 00:17:38.657 "memory_domains": [ 00:17:38.657 { 00:17:38.657 "dma_device_id": "system", 00:17:38.657 "dma_device_type": 1 00:17:38.657 }, 00:17:38.657 { 00:17:38.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.657 "dma_device_type": 2 00:17:38.657 } 00:17:38.657 ], 00:17:38.657 "driver_specific": {} 00:17:38.657 } 00:17:38.657 ] 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.657 22:24:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.916 22:24:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.916 "name": "Existed_Raid", 00:17:38.917 "uuid": "839f2797-b980-4283-a34b-275c3cd43f10", 00:17:38.917 "strip_size_kb": 0, 00:17:38.917 "state": "configuring", 00:17:38.917 "raid_level": "raid1", 00:17:38.917 "superblock": true, 00:17:38.917 "num_base_bdevs": 3, 00:17:38.917 "num_base_bdevs_discovered": 2, 00:17:38.917 "num_base_bdevs_operational": 3, 00:17:38.917 "base_bdevs_list": [ 00:17:38.917 { 00:17:38.917 "name": "BaseBdev1", 00:17:38.917 "uuid": "9a6b93e3-cc0a-458f-b921-64515c1e864b", 00:17:38.917 "is_configured": true, 00:17:38.917 "data_offset": 2048, 00:17:38.917 "data_size": 63488 00:17:38.917 }, 00:17:38.917 { 00:17:38.917 "name": null, 00:17:38.917 "uuid": "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441", 00:17:38.917 "is_configured": false, 00:17:38.917 "data_offset": 2048, 00:17:38.917 "data_size": 63488 00:17:38.917 }, 00:17:38.917 { 00:17:38.917 "name": "BaseBdev3", 00:17:38.917 "uuid": "6b3834f6-5ec9-466e-80b3-0b3939b0425c", 00:17:38.917 "is_configured": true, 00:17:38.917 "data_offset": 2048, 00:17:38.917 "data_size": 63488 00:17:38.917 } 00:17:38.917 ] 00:17:38.917 }' 00:17:38.917 22:24:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.917 22:24:49 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:17:39.485 22:24:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.485 22:24:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:39.744 22:24:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:39.744 22:24:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:40.002 [2024-07-12 22:24:50.101107] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:40.003 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:40.003 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:40.003 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:40.003 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:40.003 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:40.003 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:40.003 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:40.003 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:40.003 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:40.003 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:40.003 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.003 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:40.262 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.262 "name": "Existed_Raid", 00:17:40.262 "uuid": "839f2797-b980-4283-a34b-275c3cd43f10", 00:17:40.262 "strip_size_kb": 0, 00:17:40.262 "state": "configuring", 00:17:40.262 "raid_level": "raid1", 00:17:40.262 "superblock": true, 00:17:40.262 "num_base_bdevs": 3, 00:17:40.262 "num_base_bdevs_discovered": 1, 00:17:40.262 "num_base_bdevs_operational": 3, 00:17:40.262 "base_bdevs_list": [ 00:17:40.262 { 00:17:40.262 "name": "BaseBdev1", 00:17:40.262 "uuid": "9a6b93e3-cc0a-458f-b921-64515c1e864b", 00:17:40.262 "is_configured": true, 00:17:40.262 "data_offset": 2048, 00:17:40.262 "data_size": 63488 00:17:40.262 }, 00:17:40.262 { 00:17:40.262 "name": null, 00:17:40.262 "uuid": "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441", 00:17:40.262 "is_configured": false, 00:17:40.262 "data_offset": 2048, 00:17:40.262 "data_size": 63488 00:17:40.262 }, 00:17:40.262 { 00:17:40.262 "name": null, 00:17:40.262 "uuid": "6b3834f6-5ec9-466e-80b3-0b3939b0425c", 00:17:40.262 "is_configured": false, 00:17:40.262 "data_offset": 2048, 00:17:40.262 "data_size": 63488 00:17:40.262 } 00:17:40.262 ] 00:17:40.262 }' 
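The round trip just logged is the pattern this test repeats for every state transition: bdev_raid_remove_base_bdev drops BaseBdev3 out of the array, then verify_raid_bdev_state re-reads the raid over the test's private RPC socket and confirms Existed_Raid stays in "configuring" with num_base_bdevs_operational 3 while only the remaining slot is configured. A minimal hand-run sketch of the same check, assuming the bdev_svc from this run is still listening on /var/tmp/spdk-raid.sock and using the repo-relative rpc.py path as shorthand for the absolute path in the log:

  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
  # dump all raid bdevs and pull out the one under test, as bdev_raid.sh@126 does
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "Existed_Raid") | .state'     # expect "configuring" while a slot is empty
  # the freed slot shows up as unconfigured in base_bdevs_list
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq '.[0].base_bdevs_list[2].is_configured'                 # expect false

The bdev_raid_add_base_bdev Existed_Raid BaseBdev3 call that follows reverses the removal, and the same [2].is_configured query flips back to true.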
00:17:40.262 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.262 22:24:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:40.829 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:40.830 22:24:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.088 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:41.088 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:41.347 [2024-07-12 22:24:51.452710] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:41.347 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:41.347 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:41.347 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:41.347 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:41.347 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:41.347 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:41.347 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.347 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.347 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.347 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.347 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.347 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:41.605 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.605 "name": "Existed_Raid", 00:17:41.605 "uuid": "839f2797-b980-4283-a34b-275c3cd43f10", 00:17:41.605 "strip_size_kb": 0, 00:17:41.605 "state": "configuring", 00:17:41.605 "raid_level": "raid1", 00:17:41.605 "superblock": true, 00:17:41.605 "num_base_bdevs": 3, 00:17:41.605 "num_base_bdevs_discovered": 2, 00:17:41.605 "num_base_bdevs_operational": 3, 00:17:41.605 "base_bdevs_list": [ 00:17:41.605 { 00:17:41.605 "name": "BaseBdev1", 00:17:41.605 "uuid": "9a6b93e3-cc0a-458f-b921-64515c1e864b", 00:17:41.605 "is_configured": true, 00:17:41.605 "data_offset": 2048, 00:17:41.605 "data_size": 63488 00:17:41.605 }, 00:17:41.605 { 00:17:41.605 "name": null, 00:17:41.605 "uuid": "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441", 00:17:41.605 "is_configured": false, 00:17:41.605 "data_offset": 2048, 00:17:41.605 "data_size": 63488 00:17:41.605 }, 00:17:41.605 { 00:17:41.605 "name": "BaseBdev3", 
00:17:41.605 "uuid": "6b3834f6-5ec9-466e-80b3-0b3939b0425c", 00:17:41.605 "is_configured": true, 00:17:41.605 "data_offset": 2048, 00:17:41.605 "data_size": 63488 00:17:41.605 } 00:17:41.605 ] 00:17:41.605 }' 00:17:41.605 22:24:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.605 22:24:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:42.217 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.217 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:42.217 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:42.217 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:42.499 [2024-07-12 22:24:52.752183] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:42.499 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:42.499 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:42.499 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:42.499 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:42.499 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:42.499 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:42.499 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:42.499 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:42.499 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:42.499 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:42.499 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.499 22:24:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.757 22:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.757 "name": "Existed_Raid", 00:17:42.757 "uuid": "839f2797-b980-4283-a34b-275c3cd43f10", 00:17:42.757 "strip_size_kb": 0, 00:17:42.757 "state": "configuring", 00:17:42.757 "raid_level": "raid1", 00:17:42.757 "superblock": true, 00:17:42.757 "num_base_bdevs": 3, 00:17:42.757 "num_base_bdevs_discovered": 1, 00:17:42.757 "num_base_bdevs_operational": 3, 00:17:42.757 "base_bdevs_list": [ 00:17:42.757 { 00:17:42.757 "name": null, 00:17:42.757 "uuid": "9a6b93e3-cc0a-458f-b921-64515c1e864b", 00:17:42.757 "is_configured": false, 00:17:42.757 "data_offset": 2048, 00:17:42.757 "data_size": 63488 00:17:42.757 }, 00:17:42.757 { 00:17:42.757 "name": null, 00:17:42.757 "uuid": "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441", 00:17:42.757 
"is_configured": false, 00:17:42.757 "data_offset": 2048, 00:17:42.757 "data_size": 63488 00:17:42.757 }, 00:17:42.757 { 00:17:42.757 "name": "BaseBdev3", 00:17:42.757 "uuid": "6b3834f6-5ec9-466e-80b3-0b3939b0425c", 00:17:42.757 "is_configured": true, 00:17:42.757 "data_offset": 2048, 00:17:42.757 "data_size": 63488 00:17:42.757 } 00:17:42.757 ] 00:17:42.757 }' 00:17:42.757 22:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.757 22:24:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:43.324 22:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.324 22:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:43.583 22:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:43.583 22:24:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:43.849 [2024-07-12 22:24:54.026113] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:43.849 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:43.849 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:43.849 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:43.849 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:43.849 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:43.850 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:43.850 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.850 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.850 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.850 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.850 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.850 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:44.108 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.108 "name": "Existed_Raid", 00:17:44.108 "uuid": "839f2797-b980-4283-a34b-275c3cd43f10", 00:17:44.108 "strip_size_kb": 0, 00:17:44.108 "state": "configuring", 00:17:44.108 "raid_level": "raid1", 00:17:44.108 "superblock": true, 00:17:44.108 "num_base_bdevs": 3, 00:17:44.108 "num_base_bdevs_discovered": 2, 00:17:44.108 "num_base_bdevs_operational": 3, 00:17:44.108 "base_bdevs_list": [ 00:17:44.108 { 00:17:44.108 "name": null, 00:17:44.108 "uuid": "9a6b93e3-cc0a-458f-b921-64515c1e864b", 00:17:44.108 "is_configured": false, 
00:17:44.108 "data_offset": 2048, 00:17:44.108 "data_size": 63488 00:17:44.108 }, 00:17:44.108 { 00:17:44.108 "name": "BaseBdev2", 00:17:44.108 "uuid": "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441", 00:17:44.108 "is_configured": true, 00:17:44.108 "data_offset": 2048, 00:17:44.108 "data_size": 63488 00:17:44.108 }, 00:17:44.108 { 00:17:44.108 "name": "BaseBdev3", 00:17:44.108 "uuid": "6b3834f6-5ec9-466e-80b3-0b3939b0425c", 00:17:44.108 "is_configured": true, 00:17:44.108 "data_offset": 2048, 00:17:44.108 "data_size": 63488 00:17:44.108 } 00:17:44.108 ] 00:17:44.108 }' 00:17:44.108 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.108 22:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:44.676 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.676 22:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:44.935 22:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:44.935 22:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.935 22:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:45.195 22:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 9a6b93e3-cc0a-458f-b921-64515c1e864b 00:17:45.454 [2024-07-12 22:24:55.582774] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:45.454 [2024-07-12 22:24:55.582940] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26471b0 00:17:45.454 [2024-07-12 22:24:55.582954] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:45.454 [2024-07-12 22:24:55.583132] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28034f0 00:17:45.454 [2024-07-12 22:24:55.583252] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26471b0 00:17:45.454 [2024-07-12 22:24:55.583262] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x26471b0 00:17:45.454 [2024-07-12 22:24:55.583358] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:45.454 NewBaseBdev 00:17:45.454 22:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:45.454 22:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:45.454 22:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:45.454 22:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:45.454 22:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:45.454 22:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:45.454 22:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:45.714 22:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:45.973 [ 00:17:45.973 { 00:17:45.973 "name": "NewBaseBdev", 00:17:45.973 "aliases": [ 00:17:45.973 "9a6b93e3-cc0a-458f-b921-64515c1e864b" 00:17:45.973 ], 00:17:45.973 "product_name": "Malloc disk", 00:17:45.973 "block_size": 512, 00:17:45.973 "num_blocks": 65536, 00:17:45.973 "uuid": "9a6b93e3-cc0a-458f-b921-64515c1e864b", 00:17:45.973 "assigned_rate_limits": { 00:17:45.973 "rw_ios_per_sec": 0, 00:17:45.973 "rw_mbytes_per_sec": 0, 00:17:45.973 "r_mbytes_per_sec": 0, 00:17:45.973 "w_mbytes_per_sec": 0 00:17:45.973 }, 00:17:45.973 "claimed": true, 00:17:45.973 "claim_type": "exclusive_write", 00:17:45.973 "zoned": false, 00:17:45.973 "supported_io_types": { 00:17:45.973 "read": true, 00:17:45.973 "write": true, 00:17:45.973 "unmap": true, 00:17:45.973 "flush": true, 00:17:45.973 "reset": true, 00:17:45.973 "nvme_admin": false, 00:17:45.973 "nvme_io": false, 00:17:45.973 "nvme_io_md": false, 00:17:45.973 "write_zeroes": true, 00:17:45.973 "zcopy": true, 00:17:45.973 "get_zone_info": false, 00:17:45.973 "zone_management": false, 00:17:45.973 "zone_append": false, 00:17:45.973 "compare": false, 00:17:45.973 "compare_and_write": false, 00:17:45.973 "abort": true, 00:17:45.973 "seek_hole": false, 00:17:45.973 "seek_data": false, 00:17:45.973 "copy": true, 00:17:45.973 "nvme_iov_md": false 00:17:45.973 }, 00:17:45.973 "memory_domains": [ 00:17:45.973 { 00:17:45.973 "dma_device_id": "system", 00:17:45.973 "dma_device_type": 1 00:17:45.973 }, 00:17:45.973 { 00:17:45.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.973 "dma_device_type": 2 00:17:45.973 } 00:17:45.973 ], 00:17:45.973 "driver_specific": {} 00:17:45.973 } 00:17:45.973 ] 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.973 22:24:56 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:46.232 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:46.232 "name": "Existed_Raid", 00:17:46.232 "uuid": "839f2797-b980-4283-a34b-275c3cd43f10", 00:17:46.232 "strip_size_kb": 0, 00:17:46.232 "state": "online", 00:17:46.232 "raid_level": "raid1", 00:17:46.232 "superblock": true, 00:17:46.232 "num_base_bdevs": 3, 00:17:46.232 "num_base_bdevs_discovered": 3, 00:17:46.232 "num_base_bdevs_operational": 3, 00:17:46.232 "base_bdevs_list": [ 00:17:46.232 { 00:17:46.232 "name": "NewBaseBdev", 00:17:46.232 "uuid": "9a6b93e3-cc0a-458f-b921-64515c1e864b", 00:17:46.232 "is_configured": true, 00:17:46.232 "data_offset": 2048, 00:17:46.232 "data_size": 63488 00:17:46.232 }, 00:17:46.232 { 00:17:46.232 "name": "BaseBdev2", 00:17:46.232 "uuid": "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441", 00:17:46.232 "is_configured": true, 00:17:46.232 "data_offset": 2048, 00:17:46.232 "data_size": 63488 00:17:46.232 }, 00:17:46.232 { 00:17:46.232 "name": "BaseBdev3", 00:17:46.232 "uuid": "6b3834f6-5ec9-466e-80b3-0b3939b0425c", 00:17:46.232 "is_configured": true, 00:17:46.232 "data_offset": 2048, 00:17:46.232 "data_size": 63488 00:17:46.232 } 00:17:46.232 ] 00:17:46.232 }' 00:17:46.232 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:46.232 22:24:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:46.802 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:46.802 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:46.802 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:46.802 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:46.802 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:46.802 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:46.802 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:46.802 22:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:46.802 [2024-07-12 22:24:57.099096] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:46.802 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:46.802 "name": "Existed_Raid", 00:17:46.802 "aliases": [ 00:17:46.802 "839f2797-b980-4283-a34b-275c3cd43f10" 00:17:46.802 ], 00:17:46.802 "product_name": "Raid Volume", 00:17:46.802 "block_size": 512, 00:17:46.802 "num_blocks": 63488, 00:17:46.802 "uuid": "839f2797-b980-4283-a34b-275c3cd43f10", 00:17:46.802 "assigned_rate_limits": { 00:17:46.802 "rw_ios_per_sec": 0, 00:17:46.802 "rw_mbytes_per_sec": 0, 00:17:46.802 "r_mbytes_per_sec": 0, 00:17:46.802 "w_mbytes_per_sec": 0 00:17:46.802 }, 00:17:46.802 "claimed": false, 00:17:46.802 "zoned": false, 00:17:46.802 "supported_io_types": { 00:17:46.802 "read": true, 00:17:46.802 "write": true, 00:17:46.802 "unmap": false, 00:17:46.802 "flush": false, 00:17:46.802 "reset": true, 00:17:46.802 "nvme_admin": false, 00:17:46.802 "nvme_io": false, 00:17:46.802 "nvme_io_md": 
false, 00:17:46.802 "write_zeroes": true, 00:17:46.802 "zcopy": false, 00:17:46.802 "get_zone_info": false, 00:17:46.802 "zone_management": false, 00:17:46.802 "zone_append": false, 00:17:46.802 "compare": false, 00:17:46.802 "compare_and_write": false, 00:17:46.802 "abort": false, 00:17:46.802 "seek_hole": false, 00:17:46.802 "seek_data": false, 00:17:46.802 "copy": false, 00:17:46.802 "nvme_iov_md": false 00:17:46.802 }, 00:17:46.802 "memory_domains": [ 00:17:46.802 { 00:17:46.802 "dma_device_id": "system", 00:17:46.802 "dma_device_type": 1 00:17:46.802 }, 00:17:46.802 { 00:17:46.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.802 "dma_device_type": 2 00:17:46.802 }, 00:17:46.802 { 00:17:46.802 "dma_device_id": "system", 00:17:46.802 "dma_device_type": 1 00:17:46.802 }, 00:17:46.802 { 00:17:46.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.802 "dma_device_type": 2 00:17:46.802 }, 00:17:46.802 { 00:17:46.802 "dma_device_id": "system", 00:17:46.802 "dma_device_type": 1 00:17:46.802 }, 00:17:46.802 { 00:17:46.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.802 "dma_device_type": 2 00:17:46.802 } 00:17:46.802 ], 00:17:46.802 "driver_specific": { 00:17:46.802 "raid": { 00:17:46.802 "uuid": "839f2797-b980-4283-a34b-275c3cd43f10", 00:17:46.802 "strip_size_kb": 0, 00:17:46.802 "state": "online", 00:17:46.802 "raid_level": "raid1", 00:17:46.802 "superblock": true, 00:17:46.802 "num_base_bdevs": 3, 00:17:46.802 "num_base_bdevs_discovered": 3, 00:17:46.802 "num_base_bdevs_operational": 3, 00:17:46.802 "base_bdevs_list": [ 00:17:46.802 { 00:17:46.802 "name": "NewBaseBdev", 00:17:46.802 "uuid": "9a6b93e3-cc0a-458f-b921-64515c1e864b", 00:17:46.802 "is_configured": true, 00:17:46.802 "data_offset": 2048, 00:17:46.802 "data_size": 63488 00:17:46.802 }, 00:17:46.802 { 00:17:46.802 "name": "BaseBdev2", 00:17:46.802 "uuid": "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441", 00:17:46.802 "is_configured": true, 00:17:46.802 "data_offset": 2048, 00:17:46.802 "data_size": 63488 00:17:46.802 }, 00:17:46.802 { 00:17:46.802 "name": "BaseBdev3", 00:17:46.802 "uuid": "6b3834f6-5ec9-466e-80b3-0b3939b0425c", 00:17:46.802 "is_configured": true, 00:17:46.802 "data_offset": 2048, 00:17:46.802 "data_size": 63488 00:17:46.802 } 00:17:46.802 ] 00:17:46.802 } 00:17:46.802 } 00:17:46.802 }' 00:17:46.802 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:47.062 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:47.062 BaseBdev2 00:17:47.062 BaseBdev3' 00:17:47.062 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.062 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:47.062 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:47.322 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:47.322 "name": "NewBaseBdev", 00:17:47.322 "aliases": [ 00:17:47.322 "9a6b93e3-cc0a-458f-b921-64515c1e864b" 00:17:47.322 ], 00:17:47.322 "product_name": "Malloc disk", 00:17:47.322 "block_size": 512, 00:17:47.322 "num_blocks": 65536, 00:17:47.322 "uuid": "9a6b93e3-cc0a-458f-b921-64515c1e864b", 00:17:47.322 "assigned_rate_limits": { 00:17:47.322 
"rw_ios_per_sec": 0, 00:17:47.322 "rw_mbytes_per_sec": 0, 00:17:47.322 "r_mbytes_per_sec": 0, 00:17:47.322 "w_mbytes_per_sec": 0 00:17:47.322 }, 00:17:47.322 "claimed": true, 00:17:47.322 "claim_type": "exclusive_write", 00:17:47.322 "zoned": false, 00:17:47.322 "supported_io_types": { 00:17:47.322 "read": true, 00:17:47.322 "write": true, 00:17:47.322 "unmap": true, 00:17:47.322 "flush": true, 00:17:47.322 "reset": true, 00:17:47.322 "nvme_admin": false, 00:17:47.322 "nvme_io": false, 00:17:47.322 "nvme_io_md": false, 00:17:47.322 "write_zeroes": true, 00:17:47.322 "zcopy": true, 00:17:47.322 "get_zone_info": false, 00:17:47.322 "zone_management": false, 00:17:47.322 "zone_append": false, 00:17:47.322 "compare": false, 00:17:47.322 "compare_and_write": false, 00:17:47.322 "abort": true, 00:17:47.322 "seek_hole": false, 00:17:47.322 "seek_data": false, 00:17:47.322 "copy": true, 00:17:47.322 "nvme_iov_md": false 00:17:47.322 }, 00:17:47.322 "memory_domains": [ 00:17:47.322 { 00:17:47.322 "dma_device_id": "system", 00:17:47.322 "dma_device_type": 1 00:17:47.322 }, 00:17:47.322 { 00:17:47.322 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.322 "dma_device_type": 2 00:17:47.322 } 00:17:47.322 ], 00:17:47.322 "driver_specific": {} 00:17:47.322 }' 00:17:47.322 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.322 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.322 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:47.322 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.322 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.322 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:47.322 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.322 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.581 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:47.581 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.581 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.581 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:47.581 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.581 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:47.581 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:47.840 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:47.840 "name": "BaseBdev2", 00:17:47.840 "aliases": [ 00:17:47.840 "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441" 00:17:47.840 ], 00:17:47.840 "product_name": "Malloc disk", 00:17:47.840 "block_size": 512, 00:17:47.840 "num_blocks": 65536, 00:17:47.840 "uuid": "f86ef3ed-baf9-4d35-b6b0-dcadc6a24441", 00:17:47.840 "assigned_rate_limits": { 00:17:47.840 "rw_ios_per_sec": 0, 00:17:47.840 "rw_mbytes_per_sec": 0, 00:17:47.840 "r_mbytes_per_sec": 0, 00:17:47.840 "w_mbytes_per_sec": 0 
00:17:47.840 }, 00:17:47.840 "claimed": true, 00:17:47.840 "claim_type": "exclusive_write", 00:17:47.840 "zoned": false, 00:17:47.840 "supported_io_types": { 00:17:47.840 "read": true, 00:17:47.840 "write": true, 00:17:47.840 "unmap": true, 00:17:47.840 "flush": true, 00:17:47.840 "reset": true, 00:17:47.840 "nvme_admin": false, 00:17:47.840 "nvme_io": false, 00:17:47.840 "nvme_io_md": false, 00:17:47.840 "write_zeroes": true, 00:17:47.840 "zcopy": true, 00:17:47.840 "get_zone_info": false, 00:17:47.840 "zone_management": false, 00:17:47.840 "zone_append": false, 00:17:47.840 "compare": false, 00:17:47.840 "compare_and_write": false, 00:17:47.840 "abort": true, 00:17:47.840 "seek_hole": false, 00:17:47.840 "seek_data": false, 00:17:47.840 "copy": true, 00:17:47.840 "nvme_iov_md": false 00:17:47.840 }, 00:17:47.840 "memory_domains": [ 00:17:47.840 { 00:17:47.840 "dma_device_id": "system", 00:17:47.840 "dma_device_type": 1 00:17:47.840 }, 00:17:47.840 { 00:17:47.841 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.841 "dma_device_type": 2 00:17:47.841 } 00:17:47.841 ], 00:17:47.841 "driver_specific": {} 00:17:47.841 }' 00:17:47.841 22:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.841 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.841 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:47.841 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.841 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.841 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:47.841 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.099 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.100 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:48.100 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.100 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.100 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:48.100 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:48.100 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:48.100 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:48.359 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:48.359 "name": "BaseBdev3", 00:17:48.359 "aliases": [ 00:17:48.359 "6b3834f6-5ec9-466e-80b3-0b3939b0425c" 00:17:48.359 ], 00:17:48.359 "product_name": "Malloc disk", 00:17:48.359 "block_size": 512, 00:17:48.359 "num_blocks": 65536, 00:17:48.359 "uuid": "6b3834f6-5ec9-466e-80b3-0b3939b0425c", 00:17:48.359 "assigned_rate_limits": { 00:17:48.359 "rw_ios_per_sec": 0, 00:17:48.359 "rw_mbytes_per_sec": 0, 00:17:48.359 "r_mbytes_per_sec": 0, 00:17:48.359 "w_mbytes_per_sec": 0 00:17:48.359 }, 00:17:48.359 "claimed": true, 00:17:48.359 "claim_type": "exclusive_write", 00:17:48.359 "zoned": false, 00:17:48.359 
"supported_io_types": { 00:17:48.359 "read": true, 00:17:48.359 "write": true, 00:17:48.359 "unmap": true, 00:17:48.359 "flush": true, 00:17:48.359 "reset": true, 00:17:48.359 "nvme_admin": false, 00:17:48.359 "nvme_io": false, 00:17:48.359 "nvme_io_md": false, 00:17:48.359 "write_zeroes": true, 00:17:48.359 "zcopy": true, 00:17:48.359 "get_zone_info": false, 00:17:48.359 "zone_management": false, 00:17:48.359 "zone_append": false, 00:17:48.359 "compare": false, 00:17:48.359 "compare_and_write": false, 00:17:48.359 "abort": true, 00:17:48.359 "seek_hole": false, 00:17:48.359 "seek_data": false, 00:17:48.359 "copy": true, 00:17:48.359 "nvme_iov_md": false 00:17:48.359 }, 00:17:48.359 "memory_domains": [ 00:17:48.359 { 00:17:48.359 "dma_device_id": "system", 00:17:48.359 "dma_device_type": 1 00:17:48.359 }, 00:17:48.359 { 00:17:48.359 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.359 "dma_device_type": 2 00:17:48.359 } 00:17:48.359 ], 00:17:48.359 "driver_specific": {} 00:17:48.359 }' 00:17:48.359 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.359 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.359 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:48.359 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.359 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.359 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:48.359 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.619 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.619 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:48.619 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.619 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.619 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:48.619 22:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:48.879 [2024-07-12 22:24:59.043966] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:48.879 [2024-07-12 22:24:59.043993] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:48.879 [2024-07-12 22:24:59.044050] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:48.879 [2024-07-12 22:24:59.044325] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:48.879 [2024-07-12 22:24:59.044339] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26471b0 name Existed_Raid, state offline 00:17:48.879 22:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3472118 00:17:48.879 22:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3472118 ']' 00:17:48.879 22:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3472118 00:17:48.879 22:24:59 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:17:48.879 22:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:48.879 22:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3472118 00:17:48.879 22:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:48.879 22:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:48.879 22:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3472118' 00:17:48.879 killing process with pid 3472118 00:17:48.879 22:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3472118 00:17:48.879 [2024-07-12 22:24:59.112307] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:48.879 22:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3472118 00:17:48.879 [2024-07-12 22:24:59.143207] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:49.140 22:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:49.140 00:17:49.140 real 0m28.405s 00:17:49.140 user 0m51.963s 00:17:49.140 sys 0m5.169s 00:17:49.140 22:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:49.140 22:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:49.140 ************************************ 00:17:49.140 END TEST raid_state_function_test_sb 00:17:49.140 ************************************ 00:17:49.140 22:24:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:49.140 22:24:59 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:17:49.140 22:24:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:17:49.140 22:24:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:49.140 22:24:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:49.140 ************************************ 00:17:49.140 START TEST raid_superblock_test 00:17:49.140 ************************************ 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 3 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:49.140 22:24:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3476411 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3476411 /var/tmp/spdk-raid.sock 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3476411 ']' 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:49.140 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:49.140 22:24:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:49.399 [2024-07-12 22:24:59.468371] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
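For the superblock test the harness starts from a bare SPDK application: test/app/bdev_svc/bdev_svc is launched in the background with a private RPC socket (-r /var/tmp/spdk-raid.sock) and raid debug logging (-L bdev_raid), its PID is recorded as raid_pid, and waitforlisten blocks until the socket accepts RPCs before any bdevs are built. A rough equivalent of that bring-up, assuming the same repo-relative paths; the readiness loop is only an illustrative stand-in for the harness's waitforlisten helper, whose internals are not shown in this log:

  # background the bare bdev service with a private RPC socket and raid debug logging
  test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid &
  raid_pid=$!
  # poll until the RPC server answers; an empty bdev list is enough to prove it is up
  until scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs >/dev/null 2>&1; do sleep 0.5; done

Once the socket is live, the test builds its base bdevs the same way as every other RPC in this log, via rpc.py -s /var/tmp/spdk-raid.sock and the bdev_malloc_create / bdev_passthru_create calls that appear below.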
00:17:49.399 [2024-07-12 22:24:59.468417] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3476411 ] 00:17:49.399 [2024-07-12 22:24:59.581330] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:49.399 [2024-07-12 22:24:59.687191] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:49.659 [2024-07-12 22:24:59.749936] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:49.659 [2024-07-12 22:24:59.749974] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:50.228 22:25:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:50.228 22:25:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:50.228 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:50.228 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:50.228 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:50.228 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:50.228 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:50.228 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:50.228 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:50.228 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:50.228 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:50.487 malloc1 00:17:50.487 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:50.747 [2024-07-12 22:25:00.902419] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:50.747 [2024-07-12 22:25:00.902467] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:50.747 [2024-07-12 22:25:00.902491] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb4f570 00:17:50.747 [2024-07-12 22:25:00.902503] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:50.747 [2024-07-12 22:25:00.904325] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:50.747 [2024-07-12 22:25:00.904355] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:50.747 pt1 00:17:50.747 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:50.747 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:50.747 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:50.747 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:50.747 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:50.747 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:50.747 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:50.747 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:50.747 22:25:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:51.007 malloc2 00:17:51.007 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:51.266 [2024-07-12 22:25:01.397789] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:51.266 [2024-07-12 22:25:01.397840] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:51.266 [2024-07-12 22:25:01.397860] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb50970 00:17:51.266 [2024-07-12 22:25:01.397872] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:51.266 [2024-07-12 22:25:01.399587] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:51.266 [2024-07-12 22:25:01.399615] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:51.266 pt2 00:17:51.266 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:51.266 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:51.266 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:51.266 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:51.266 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:51.266 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:51.266 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:51.266 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:51.266 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:51.525 malloc3 00:17:51.525 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:51.784 [2024-07-12 22:25:01.889022] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:51.784 [2024-07-12 22:25:01.889073] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:51.784 [2024-07-12 22:25:01.889092] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xce7340 00:17:51.784 [2024-07-12 22:25:01.889104] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:51.784 [2024-07-12 22:25:01.890756] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:51.784 [2024-07-12 22:25:01.890784] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:51.784 pt3 00:17:51.784 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:51.784 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:51.784 22:25:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:17:52.044 [2024-07-12 22:25:02.137690] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:52.044 [2024-07-12 22:25:02.138952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:52.044 [2024-07-12 22:25:02.139013] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:52.044 [2024-07-12 22:25:02.139168] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb47ea0 00:17:52.044 [2024-07-12 22:25:02.139180] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:52.044 [2024-07-12 22:25:02.139374] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb4f240 00:17:52.044 [2024-07-12 22:25:02.139520] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb47ea0 00:17:52.044 [2024-07-12 22:25:02.139531] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb47ea0 00:17:52.044 [2024-07-12 22:25:02.139628] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:52.044 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:52.044 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:52.044 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:52.044 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:52.044 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:52.044 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:52.044 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.044 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.044 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.044 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.044 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.044 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:52.303 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.303 "name": "raid_bdev1", 00:17:52.303 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:17:52.303 "strip_size_kb": 0, 00:17:52.303 "state": "online", 00:17:52.303 "raid_level": "raid1", 00:17:52.303 "superblock": true, 00:17:52.303 "num_base_bdevs": 3, 00:17:52.303 
"num_base_bdevs_discovered": 3, 00:17:52.303 "num_base_bdevs_operational": 3, 00:17:52.303 "base_bdevs_list": [ 00:17:52.303 { 00:17:52.303 "name": "pt1", 00:17:52.303 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:52.303 "is_configured": true, 00:17:52.303 "data_offset": 2048, 00:17:52.303 "data_size": 63488 00:17:52.303 }, 00:17:52.303 { 00:17:52.303 "name": "pt2", 00:17:52.303 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:52.303 "is_configured": true, 00:17:52.303 "data_offset": 2048, 00:17:52.303 "data_size": 63488 00:17:52.303 }, 00:17:52.303 { 00:17:52.303 "name": "pt3", 00:17:52.303 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:52.303 "is_configured": true, 00:17:52.303 "data_offset": 2048, 00:17:52.303 "data_size": 63488 00:17:52.303 } 00:17:52.303 ] 00:17:52.303 }' 00:17:52.303 22:25:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.303 22:25:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.872 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:52.872 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:52.872 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:52.872 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:52.872 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:52.872 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:52.872 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:52.872 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:53.131 [2024-07-12 22:25:03.248885] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:53.131 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:53.131 "name": "raid_bdev1", 00:17:53.131 "aliases": [ 00:17:53.131 "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8" 00:17:53.131 ], 00:17:53.131 "product_name": "Raid Volume", 00:17:53.131 "block_size": 512, 00:17:53.131 "num_blocks": 63488, 00:17:53.131 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:17:53.131 "assigned_rate_limits": { 00:17:53.131 "rw_ios_per_sec": 0, 00:17:53.131 "rw_mbytes_per_sec": 0, 00:17:53.131 "r_mbytes_per_sec": 0, 00:17:53.131 "w_mbytes_per_sec": 0 00:17:53.131 }, 00:17:53.131 "claimed": false, 00:17:53.131 "zoned": false, 00:17:53.131 "supported_io_types": { 00:17:53.131 "read": true, 00:17:53.131 "write": true, 00:17:53.131 "unmap": false, 00:17:53.131 "flush": false, 00:17:53.131 "reset": true, 00:17:53.131 "nvme_admin": false, 00:17:53.131 "nvme_io": false, 00:17:53.131 "nvme_io_md": false, 00:17:53.131 "write_zeroes": true, 00:17:53.131 "zcopy": false, 00:17:53.131 "get_zone_info": false, 00:17:53.131 "zone_management": false, 00:17:53.131 "zone_append": false, 00:17:53.131 "compare": false, 00:17:53.131 "compare_and_write": false, 00:17:53.131 "abort": false, 00:17:53.131 "seek_hole": false, 00:17:53.131 "seek_data": false, 00:17:53.131 "copy": false, 00:17:53.131 "nvme_iov_md": false 00:17:53.131 }, 00:17:53.131 "memory_domains": [ 00:17:53.131 { 00:17:53.131 "dma_device_id": "system", 00:17:53.131 "dma_device_type": 1 00:17:53.131 }, 
00:17:53.131 { 00:17:53.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.131 "dma_device_type": 2 00:17:53.131 }, 00:17:53.131 { 00:17:53.131 "dma_device_id": "system", 00:17:53.131 "dma_device_type": 1 00:17:53.131 }, 00:17:53.131 { 00:17:53.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.131 "dma_device_type": 2 00:17:53.131 }, 00:17:53.131 { 00:17:53.131 "dma_device_id": "system", 00:17:53.131 "dma_device_type": 1 00:17:53.131 }, 00:17:53.131 { 00:17:53.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.131 "dma_device_type": 2 00:17:53.131 } 00:17:53.131 ], 00:17:53.131 "driver_specific": { 00:17:53.131 "raid": { 00:17:53.131 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:17:53.131 "strip_size_kb": 0, 00:17:53.131 "state": "online", 00:17:53.131 "raid_level": "raid1", 00:17:53.131 "superblock": true, 00:17:53.131 "num_base_bdevs": 3, 00:17:53.131 "num_base_bdevs_discovered": 3, 00:17:53.131 "num_base_bdevs_operational": 3, 00:17:53.131 "base_bdevs_list": [ 00:17:53.131 { 00:17:53.131 "name": "pt1", 00:17:53.131 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:53.131 "is_configured": true, 00:17:53.131 "data_offset": 2048, 00:17:53.131 "data_size": 63488 00:17:53.131 }, 00:17:53.131 { 00:17:53.131 "name": "pt2", 00:17:53.131 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:53.131 "is_configured": true, 00:17:53.131 "data_offset": 2048, 00:17:53.131 "data_size": 63488 00:17:53.131 }, 00:17:53.131 { 00:17:53.131 "name": "pt3", 00:17:53.131 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:53.131 "is_configured": true, 00:17:53.131 "data_offset": 2048, 00:17:53.131 "data_size": 63488 00:17:53.131 } 00:17:53.131 ] 00:17:53.131 } 00:17:53.131 } 00:17:53.131 }' 00:17:53.131 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:53.131 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:53.131 pt2 00:17:53.131 pt3' 00:17:53.131 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:53.131 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:53.131 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:53.391 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:53.391 "name": "pt1", 00:17:53.391 "aliases": [ 00:17:53.391 "00000000-0000-0000-0000-000000000001" 00:17:53.391 ], 00:17:53.391 "product_name": "passthru", 00:17:53.391 "block_size": 512, 00:17:53.391 "num_blocks": 65536, 00:17:53.391 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:53.391 "assigned_rate_limits": { 00:17:53.391 "rw_ios_per_sec": 0, 00:17:53.391 "rw_mbytes_per_sec": 0, 00:17:53.391 "r_mbytes_per_sec": 0, 00:17:53.391 "w_mbytes_per_sec": 0 00:17:53.391 }, 00:17:53.391 "claimed": true, 00:17:53.391 "claim_type": "exclusive_write", 00:17:53.391 "zoned": false, 00:17:53.391 "supported_io_types": { 00:17:53.391 "read": true, 00:17:53.391 "write": true, 00:17:53.391 "unmap": true, 00:17:53.391 "flush": true, 00:17:53.391 "reset": true, 00:17:53.391 "nvme_admin": false, 00:17:53.391 "nvme_io": false, 00:17:53.391 "nvme_io_md": false, 00:17:53.391 "write_zeroes": true, 00:17:53.391 "zcopy": true, 00:17:53.391 "get_zone_info": false, 00:17:53.391 "zone_management": false, 00:17:53.391 
"zone_append": false, 00:17:53.391 "compare": false, 00:17:53.391 "compare_and_write": false, 00:17:53.391 "abort": true, 00:17:53.391 "seek_hole": false, 00:17:53.391 "seek_data": false, 00:17:53.391 "copy": true, 00:17:53.391 "nvme_iov_md": false 00:17:53.391 }, 00:17:53.391 "memory_domains": [ 00:17:53.391 { 00:17:53.391 "dma_device_id": "system", 00:17:53.391 "dma_device_type": 1 00:17:53.391 }, 00:17:53.391 { 00:17:53.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.391 "dma_device_type": 2 00:17:53.391 } 00:17:53.391 ], 00:17:53.391 "driver_specific": { 00:17:53.391 "passthru": { 00:17:53.391 "name": "pt1", 00:17:53.391 "base_bdev_name": "malloc1" 00:17:53.391 } 00:17:53.391 } 00:17:53.391 }' 00:17:53.391 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.391 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.391 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:53.391 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:53.391 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:53.661 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:53.661 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:53.661 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:53.661 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:53.661 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:53.661 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:53.661 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:53.661 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:53.661 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:53.661 22:25:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:53.924 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:53.924 "name": "pt2", 00:17:53.924 "aliases": [ 00:17:53.924 "00000000-0000-0000-0000-000000000002" 00:17:53.924 ], 00:17:53.924 "product_name": "passthru", 00:17:53.924 "block_size": 512, 00:17:53.924 "num_blocks": 65536, 00:17:53.924 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:53.924 "assigned_rate_limits": { 00:17:53.924 "rw_ios_per_sec": 0, 00:17:53.924 "rw_mbytes_per_sec": 0, 00:17:53.924 "r_mbytes_per_sec": 0, 00:17:53.924 "w_mbytes_per_sec": 0 00:17:53.924 }, 00:17:53.924 "claimed": true, 00:17:53.924 "claim_type": "exclusive_write", 00:17:53.924 "zoned": false, 00:17:53.924 "supported_io_types": { 00:17:53.924 "read": true, 00:17:53.924 "write": true, 00:17:53.924 "unmap": true, 00:17:53.924 "flush": true, 00:17:53.924 "reset": true, 00:17:53.924 "nvme_admin": false, 00:17:53.924 "nvme_io": false, 00:17:53.924 "nvme_io_md": false, 00:17:53.924 "write_zeroes": true, 00:17:53.924 "zcopy": true, 00:17:53.924 "get_zone_info": false, 00:17:53.924 "zone_management": false, 00:17:53.924 "zone_append": false, 00:17:53.924 "compare": false, 00:17:53.924 "compare_and_write": false, 00:17:53.924 "abort": true, 00:17:53.924 
"seek_hole": false, 00:17:53.924 "seek_data": false, 00:17:53.924 "copy": true, 00:17:53.924 "nvme_iov_md": false 00:17:53.924 }, 00:17:53.924 "memory_domains": [ 00:17:53.924 { 00:17:53.924 "dma_device_id": "system", 00:17:53.924 "dma_device_type": 1 00:17:53.924 }, 00:17:53.924 { 00:17:53.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.924 "dma_device_type": 2 00:17:53.924 } 00:17:53.924 ], 00:17:53.924 "driver_specific": { 00:17:53.924 "passthru": { 00:17:53.924 "name": "pt2", 00:17:53.924 "base_bdev_name": "malloc2" 00:17:53.924 } 00:17:53.925 } 00:17:53.925 }' 00:17:53.925 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.925 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.925 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:53.925 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.184 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.184 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:54.184 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.184 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.184 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:54.184 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.184 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.444 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:54.444 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:54.444 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:54.444 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:54.703 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:54.703 "name": "pt3", 00:17:54.703 "aliases": [ 00:17:54.703 "00000000-0000-0000-0000-000000000003" 00:17:54.703 ], 00:17:54.703 "product_name": "passthru", 00:17:54.703 "block_size": 512, 00:17:54.703 "num_blocks": 65536, 00:17:54.703 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:54.703 "assigned_rate_limits": { 00:17:54.703 "rw_ios_per_sec": 0, 00:17:54.703 "rw_mbytes_per_sec": 0, 00:17:54.703 "r_mbytes_per_sec": 0, 00:17:54.703 "w_mbytes_per_sec": 0 00:17:54.703 }, 00:17:54.703 "claimed": true, 00:17:54.703 "claim_type": "exclusive_write", 00:17:54.703 "zoned": false, 00:17:54.703 "supported_io_types": { 00:17:54.703 "read": true, 00:17:54.703 "write": true, 00:17:54.703 "unmap": true, 00:17:54.703 "flush": true, 00:17:54.703 "reset": true, 00:17:54.703 "nvme_admin": false, 00:17:54.703 "nvme_io": false, 00:17:54.703 "nvme_io_md": false, 00:17:54.703 "write_zeroes": true, 00:17:54.703 "zcopy": true, 00:17:54.703 "get_zone_info": false, 00:17:54.703 "zone_management": false, 00:17:54.703 "zone_append": false, 00:17:54.703 "compare": false, 00:17:54.703 "compare_and_write": false, 00:17:54.703 "abort": true, 00:17:54.703 "seek_hole": false, 00:17:54.703 "seek_data": false, 00:17:54.703 "copy": true, 00:17:54.703 "nvme_iov_md": false 00:17:54.703 }, 
00:17:54.703 "memory_domains": [ 00:17:54.703 { 00:17:54.703 "dma_device_id": "system", 00:17:54.703 "dma_device_type": 1 00:17:54.703 }, 00:17:54.703 { 00:17:54.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.703 "dma_device_type": 2 00:17:54.703 } 00:17:54.703 ], 00:17:54.703 "driver_specific": { 00:17:54.703 "passthru": { 00:17:54.703 "name": "pt3", 00:17:54.703 "base_bdev_name": "malloc3" 00:17:54.703 } 00:17:54.703 } 00:17:54.703 }' 00:17:54.703 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.703 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.703 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:54.703 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.703 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.703 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:54.703 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.703 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.703 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:54.703 22:25:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.703 22:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.962 22:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:54.962 22:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:54.962 22:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:54.962 [2024-07-12 22:25:05.282264] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:55.222 22:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8 00:17:55.222 22:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8 ']' 00:17:55.222 22:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:55.222 [2024-07-12 22:25:05.526639] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:55.222 [2024-07-12 22:25:05.526665] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:55.222 [2024-07-12 22:25:05.526717] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:55.222 [2024-07-12 22:25:05.526787] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:55.222 [2024-07-12 22:25:05.526800] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb47ea0 name raid_bdev1, state offline 00:17:55.481 22:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:55.481 22:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.481 22:25:05 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:55.481 22:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:55.481 22:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:55.481 22:25:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:55.740 22:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:55.740 22:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:55.999 22:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:55.999 22:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:56.258 22:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:56.258 22:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:56.518 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:56.518 
[2024-07-12 22:25:06.834041] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:56.518 [2024-07-12 22:25:06.835379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:56.518 [2024-07-12 22:25:06.835422] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:56.518 [2024-07-12 22:25:06.835469] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:56.518 [2024-07-12 22:25:06.835507] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:56.518 [2024-07-12 22:25:06.835537] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:56.518 [2024-07-12 22:25:06.835555] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:56.518 [2024-07-12 22:25:06.835565] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcf2ff0 name raid_bdev1, state configuring 00:17:56.518 request: 00:17:56.518 { 00:17:56.518 "name": "raid_bdev1", 00:17:56.518 "raid_level": "raid1", 00:17:56.518 "base_bdevs": [ 00:17:56.518 "malloc1", 00:17:56.518 "malloc2", 00:17:56.518 "malloc3" 00:17:56.518 ], 00:17:56.518 "superblock": false, 00:17:56.518 "method": "bdev_raid_create", 00:17:56.518 "req_id": 1 00:17:56.518 } 00:17:56.518 Got JSON-RPC error response 00:17:56.518 response: 00:17:56.518 { 00:17:56.518 "code": -17, 00:17:56.518 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:56.518 } 00:17:56.777 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:56.777 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:56.777 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:56.777 22:25:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:56.777 22:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.777 22:25:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:57.345 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:57.345 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:57.345 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:57.345 [2024-07-12 22:25:07.579945] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:57.345 [2024-07-12 22:25:07.579996] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:57.345 [2024-07-12 22:25:07.580019] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb4f7a0 00:17:57.345 [2024-07-12 22:25:07.580032] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:57.345 [2024-07-12 22:25:07.581679] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:57.345 [2024-07-12 22:25:07.581707] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:57.345 [2024-07-12 
22:25:07.581774] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:57.345 [2024-07-12 22:25:07.581801] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:57.345 pt1 00:17:57.345 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:57.345 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:57.345 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:57.345 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:57.345 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:57.345 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:57.345 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.345 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.346 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.346 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.346 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.346 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:57.605 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.605 "name": "raid_bdev1", 00:17:57.605 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:17:57.605 "strip_size_kb": 0, 00:17:57.605 "state": "configuring", 00:17:57.605 "raid_level": "raid1", 00:17:57.605 "superblock": true, 00:17:57.605 "num_base_bdevs": 3, 00:17:57.605 "num_base_bdevs_discovered": 1, 00:17:57.605 "num_base_bdevs_operational": 3, 00:17:57.605 "base_bdevs_list": [ 00:17:57.605 { 00:17:57.605 "name": "pt1", 00:17:57.605 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:57.605 "is_configured": true, 00:17:57.605 "data_offset": 2048, 00:17:57.605 "data_size": 63488 00:17:57.605 }, 00:17:57.605 { 00:17:57.605 "name": null, 00:17:57.605 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:57.605 "is_configured": false, 00:17:57.605 "data_offset": 2048, 00:17:57.605 "data_size": 63488 00:17:57.605 }, 00:17:57.605 { 00:17:57.605 "name": null, 00:17:57.605 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:57.605 "is_configured": false, 00:17:57.605 "data_offset": 2048, 00:17:57.605 "data_size": 63488 00:17:57.605 } 00:17:57.605 ] 00:17:57.605 }' 00:17:57.605 22:25:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.605 22:25:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.173 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:17:58.173 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:58.432 [2024-07-12 22:25:08.598647] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:58.432 [2024-07-12 
22:25:08.598696] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:58.432 [2024-07-12 22:25:08.598715] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb46a10 00:17:58.432 [2024-07-12 22:25:08.598727] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:58.432 [2024-07-12 22:25:08.599066] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:58.432 [2024-07-12 22:25:08.599083] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:58.432 [2024-07-12 22:25:08.599146] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:58.432 [2024-07-12 22:25:08.599164] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:58.432 pt2 00:17:58.432 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:58.692 [2024-07-12 22:25:08.843297] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:58.692 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:58.692 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:58.692 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:58.692 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:58.692 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:58.692 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:58.692 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.692 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:58.692 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:58.692 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.692 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:58.692 22:25:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.951 22:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.951 "name": "raid_bdev1", 00:17:58.951 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:17:58.951 "strip_size_kb": 0, 00:17:58.951 "state": "configuring", 00:17:58.951 "raid_level": "raid1", 00:17:58.951 "superblock": true, 00:17:58.951 "num_base_bdevs": 3, 00:17:58.951 "num_base_bdevs_discovered": 1, 00:17:58.951 "num_base_bdevs_operational": 3, 00:17:58.951 "base_bdevs_list": [ 00:17:58.951 { 00:17:58.951 "name": "pt1", 00:17:58.951 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:58.951 "is_configured": true, 00:17:58.951 "data_offset": 2048, 00:17:58.951 "data_size": 63488 00:17:58.951 }, 00:17:58.951 { 00:17:58.951 "name": null, 00:17:58.951 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:58.951 "is_configured": false, 00:17:58.951 "data_offset": 2048, 00:17:58.951 "data_size": 63488 00:17:58.951 }, 00:17:58.951 { 00:17:58.951 "name": null, 
00:17:58.951 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:58.951 "is_configured": false, 00:17:58.951 "data_offset": 2048, 00:17:58.951 "data_size": 63488 00:17:58.951 } 00:17:58.951 ] 00:17:58.951 }' 00:17:58.951 22:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.951 22:25:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:59.520 22:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:59.520 22:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:59.520 22:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:59.780 [2024-07-12 22:25:09.946219] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:59.780 [2024-07-12 22:25:09.946267] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:59.780 [2024-07-12 22:25:09.946290] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb4fa10 00:17:59.780 [2024-07-12 22:25:09.946304] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:59.780 [2024-07-12 22:25:09.946635] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:59.780 [2024-07-12 22:25:09.946653] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:59.780 [2024-07-12 22:25:09.946717] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:59.780 [2024-07-12 22:25:09.946735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:59.780 pt2 00:17:59.780 22:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:59.780 22:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:59.780 22:25:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:00.040 [2024-07-12 22:25:10.186860] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:00.040 [2024-07-12 22:25:10.186903] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:00.040 [2024-07-12 22:25:10.186920] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb466c0 00:18:00.040 [2024-07-12 22:25:10.186940] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:00.040 [2024-07-12 22:25:10.187251] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:00.040 [2024-07-12 22:25:10.187269] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:00.040 [2024-07-12 22:25:10.187328] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:00.040 [2024-07-12 22:25:10.187346] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:00.040 [2024-07-12 22:25:10.187456] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xce9c00 00:18:00.040 [2024-07-12 22:25:10.187466] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:00.040 [2024-07-12 22:25:10.187631] 
bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb49610 00:18:00.040 [2024-07-12 22:25:10.187757] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xce9c00 00:18:00.040 [2024-07-12 22:25:10.187767] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xce9c00 00:18:00.040 [2024-07-12 22:25:10.187865] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:00.040 pt3 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.040 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:00.299 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.299 "name": "raid_bdev1", 00:18:00.299 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:18:00.299 "strip_size_kb": 0, 00:18:00.299 "state": "online", 00:18:00.299 "raid_level": "raid1", 00:18:00.299 "superblock": true, 00:18:00.299 "num_base_bdevs": 3, 00:18:00.299 "num_base_bdevs_discovered": 3, 00:18:00.299 "num_base_bdevs_operational": 3, 00:18:00.299 "base_bdevs_list": [ 00:18:00.299 { 00:18:00.299 "name": "pt1", 00:18:00.299 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:00.299 "is_configured": true, 00:18:00.299 "data_offset": 2048, 00:18:00.299 "data_size": 63488 00:18:00.299 }, 00:18:00.299 { 00:18:00.299 "name": "pt2", 00:18:00.299 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:00.299 "is_configured": true, 00:18:00.299 "data_offset": 2048, 00:18:00.299 "data_size": 63488 00:18:00.299 }, 00:18:00.299 { 00:18:00.299 "name": "pt3", 00:18:00.299 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:00.299 "is_configured": true, 00:18:00.299 "data_offset": 2048, 00:18:00.299 "data_size": 63488 00:18:00.299 } 00:18:00.299 ] 00:18:00.299 }' 00:18:00.299 22:25:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.299 22:25:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:00.870 22:25:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:00.870 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:00.870 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:00.870 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:00.870 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:00.870 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:00.870 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:00.870 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:01.502 [2024-07-12 22:25:11.522682] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:01.502 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:01.502 "name": "raid_bdev1", 00:18:01.502 "aliases": [ 00:18:01.502 "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8" 00:18:01.502 ], 00:18:01.502 "product_name": "Raid Volume", 00:18:01.502 "block_size": 512, 00:18:01.502 "num_blocks": 63488, 00:18:01.502 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:18:01.502 "assigned_rate_limits": { 00:18:01.502 "rw_ios_per_sec": 0, 00:18:01.502 "rw_mbytes_per_sec": 0, 00:18:01.502 "r_mbytes_per_sec": 0, 00:18:01.502 "w_mbytes_per_sec": 0 00:18:01.502 }, 00:18:01.502 "claimed": false, 00:18:01.502 "zoned": false, 00:18:01.502 "supported_io_types": { 00:18:01.502 "read": true, 00:18:01.502 "write": true, 00:18:01.502 "unmap": false, 00:18:01.502 "flush": false, 00:18:01.502 "reset": true, 00:18:01.502 "nvme_admin": false, 00:18:01.502 "nvme_io": false, 00:18:01.502 "nvme_io_md": false, 00:18:01.502 "write_zeroes": true, 00:18:01.502 "zcopy": false, 00:18:01.502 "get_zone_info": false, 00:18:01.502 "zone_management": false, 00:18:01.502 "zone_append": false, 00:18:01.502 "compare": false, 00:18:01.502 "compare_and_write": false, 00:18:01.502 "abort": false, 00:18:01.502 "seek_hole": false, 00:18:01.502 "seek_data": false, 00:18:01.502 "copy": false, 00:18:01.502 "nvme_iov_md": false 00:18:01.502 }, 00:18:01.502 "memory_domains": [ 00:18:01.502 { 00:18:01.502 "dma_device_id": "system", 00:18:01.502 "dma_device_type": 1 00:18:01.502 }, 00:18:01.502 { 00:18:01.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.502 "dma_device_type": 2 00:18:01.502 }, 00:18:01.502 { 00:18:01.502 "dma_device_id": "system", 00:18:01.502 "dma_device_type": 1 00:18:01.502 }, 00:18:01.502 { 00:18:01.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.502 "dma_device_type": 2 00:18:01.502 }, 00:18:01.502 { 00:18:01.502 "dma_device_id": "system", 00:18:01.502 "dma_device_type": 1 00:18:01.502 }, 00:18:01.502 { 00:18:01.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.502 "dma_device_type": 2 00:18:01.502 } 00:18:01.502 ], 00:18:01.502 "driver_specific": { 00:18:01.502 "raid": { 00:18:01.502 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:18:01.502 "strip_size_kb": 0, 00:18:01.502 "state": "online", 00:18:01.502 "raid_level": "raid1", 00:18:01.502 "superblock": true, 00:18:01.502 "num_base_bdevs": 3, 00:18:01.502 "num_base_bdevs_discovered": 3, 00:18:01.502 "num_base_bdevs_operational": 3, 00:18:01.502 "base_bdevs_list": [ 00:18:01.502 { 00:18:01.502 "name": "pt1", 00:18:01.502 
"uuid": "00000000-0000-0000-0000-000000000001", 00:18:01.502 "is_configured": true, 00:18:01.502 "data_offset": 2048, 00:18:01.502 "data_size": 63488 00:18:01.502 }, 00:18:01.502 { 00:18:01.502 "name": "pt2", 00:18:01.502 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:01.502 "is_configured": true, 00:18:01.502 "data_offset": 2048, 00:18:01.502 "data_size": 63488 00:18:01.502 }, 00:18:01.502 { 00:18:01.502 "name": "pt3", 00:18:01.502 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:01.502 "is_configured": true, 00:18:01.502 "data_offset": 2048, 00:18:01.502 "data_size": 63488 00:18:01.502 } 00:18:01.502 ] 00:18:01.502 } 00:18:01.502 } 00:18:01.502 }' 00:18:01.502 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:01.502 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:01.502 pt2 00:18:01.502 pt3' 00:18:01.502 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:01.502 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:01.502 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:01.502 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:01.502 "name": "pt1", 00:18:01.502 "aliases": [ 00:18:01.502 "00000000-0000-0000-0000-000000000001" 00:18:01.502 ], 00:18:01.502 "product_name": "passthru", 00:18:01.502 "block_size": 512, 00:18:01.502 "num_blocks": 65536, 00:18:01.502 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:01.502 "assigned_rate_limits": { 00:18:01.502 "rw_ios_per_sec": 0, 00:18:01.502 "rw_mbytes_per_sec": 0, 00:18:01.502 "r_mbytes_per_sec": 0, 00:18:01.502 "w_mbytes_per_sec": 0 00:18:01.502 }, 00:18:01.502 "claimed": true, 00:18:01.502 "claim_type": "exclusive_write", 00:18:01.502 "zoned": false, 00:18:01.502 "supported_io_types": { 00:18:01.502 "read": true, 00:18:01.502 "write": true, 00:18:01.502 "unmap": true, 00:18:01.502 "flush": true, 00:18:01.502 "reset": true, 00:18:01.502 "nvme_admin": false, 00:18:01.502 "nvme_io": false, 00:18:01.502 "nvme_io_md": false, 00:18:01.502 "write_zeroes": true, 00:18:01.502 "zcopy": true, 00:18:01.502 "get_zone_info": false, 00:18:01.502 "zone_management": false, 00:18:01.502 "zone_append": false, 00:18:01.502 "compare": false, 00:18:01.502 "compare_and_write": false, 00:18:01.502 "abort": true, 00:18:01.502 "seek_hole": false, 00:18:01.502 "seek_data": false, 00:18:01.502 "copy": true, 00:18:01.502 "nvme_iov_md": false 00:18:01.502 }, 00:18:01.502 "memory_domains": [ 00:18:01.502 { 00:18:01.502 "dma_device_id": "system", 00:18:01.502 "dma_device_type": 1 00:18:01.502 }, 00:18:01.502 { 00:18:01.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.502 "dma_device_type": 2 00:18:01.502 } 00:18:01.502 ], 00:18:01.502 "driver_specific": { 00:18:01.502 "passthru": { 00:18:01.502 "name": "pt1", 00:18:01.502 "base_bdev_name": "malloc1" 00:18:01.502 } 00:18:01.502 } 00:18:01.502 }' 00:18:01.502 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.762 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.762 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:01.762 22:25:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.762 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.762 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:01.762 22:25:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.762 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.762 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:01.762 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.762 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.021 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:02.021 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:02.021 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:02.021 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:02.589 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:02.589 "name": "pt2", 00:18:02.589 "aliases": [ 00:18:02.589 "00000000-0000-0000-0000-000000000002" 00:18:02.589 ], 00:18:02.589 "product_name": "passthru", 00:18:02.589 "block_size": 512, 00:18:02.589 "num_blocks": 65536, 00:18:02.589 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:02.589 "assigned_rate_limits": { 00:18:02.589 "rw_ios_per_sec": 0, 00:18:02.589 "rw_mbytes_per_sec": 0, 00:18:02.589 "r_mbytes_per_sec": 0, 00:18:02.589 "w_mbytes_per_sec": 0 00:18:02.589 }, 00:18:02.589 "claimed": true, 00:18:02.589 "claim_type": "exclusive_write", 00:18:02.589 "zoned": false, 00:18:02.589 "supported_io_types": { 00:18:02.589 "read": true, 00:18:02.589 "write": true, 00:18:02.589 "unmap": true, 00:18:02.589 "flush": true, 00:18:02.589 "reset": true, 00:18:02.589 "nvme_admin": false, 00:18:02.589 "nvme_io": false, 00:18:02.589 "nvme_io_md": false, 00:18:02.589 "write_zeroes": true, 00:18:02.589 "zcopy": true, 00:18:02.589 "get_zone_info": false, 00:18:02.589 "zone_management": false, 00:18:02.589 "zone_append": false, 00:18:02.589 "compare": false, 00:18:02.589 "compare_and_write": false, 00:18:02.589 "abort": true, 00:18:02.589 "seek_hole": false, 00:18:02.589 "seek_data": false, 00:18:02.589 "copy": true, 00:18:02.589 "nvme_iov_md": false 00:18:02.589 }, 00:18:02.589 "memory_domains": [ 00:18:02.589 { 00:18:02.590 "dma_device_id": "system", 00:18:02.590 "dma_device_type": 1 00:18:02.590 }, 00:18:02.590 { 00:18:02.590 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.590 "dma_device_type": 2 00:18:02.590 } 00:18:02.590 ], 00:18:02.590 "driver_specific": { 00:18:02.590 "passthru": { 00:18:02.590 "name": "pt2", 00:18:02.590 "base_bdev_name": "malloc2" 00:18:02.590 } 00:18:02.590 } 00:18:02.590 }' 00:18:02.590 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.590 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.590 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:02.590 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.590 22:25:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.590 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:02.590 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.590 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.848 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:02.848 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.848 22:25:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.848 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:02.848 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:02.848 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:02.848 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:03.106 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:03.106 "name": "pt3", 00:18:03.106 "aliases": [ 00:18:03.106 "00000000-0000-0000-0000-000000000003" 00:18:03.106 ], 00:18:03.106 "product_name": "passthru", 00:18:03.106 "block_size": 512, 00:18:03.106 "num_blocks": 65536, 00:18:03.106 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:03.106 "assigned_rate_limits": { 00:18:03.106 "rw_ios_per_sec": 0, 00:18:03.106 "rw_mbytes_per_sec": 0, 00:18:03.106 "r_mbytes_per_sec": 0, 00:18:03.106 "w_mbytes_per_sec": 0 00:18:03.106 }, 00:18:03.106 "claimed": true, 00:18:03.106 "claim_type": "exclusive_write", 00:18:03.106 "zoned": false, 00:18:03.106 "supported_io_types": { 00:18:03.106 "read": true, 00:18:03.106 "write": true, 00:18:03.106 "unmap": true, 00:18:03.106 "flush": true, 00:18:03.106 "reset": true, 00:18:03.106 "nvme_admin": false, 00:18:03.106 "nvme_io": false, 00:18:03.106 "nvme_io_md": false, 00:18:03.106 "write_zeroes": true, 00:18:03.106 "zcopy": true, 00:18:03.106 "get_zone_info": false, 00:18:03.106 "zone_management": false, 00:18:03.106 "zone_append": false, 00:18:03.106 "compare": false, 00:18:03.106 "compare_and_write": false, 00:18:03.106 "abort": true, 00:18:03.106 "seek_hole": false, 00:18:03.106 "seek_data": false, 00:18:03.106 "copy": true, 00:18:03.106 "nvme_iov_md": false 00:18:03.106 }, 00:18:03.106 "memory_domains": [ 00:18:03.106 { 00:18:03.106 "dma_device_id": "system", 00:18:03.106 "dma_device_type": 1 00:18:03.106 }, 00:18:03.106 { 00:18:03.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.106 "dma_device_type": 2 00:18:03.106 } 00:18:03.106 ], 00:18:03.106 "driver_specific": { 00:18:03.106 "passthru": { 00:18:03.106 "name": "pt3", 00:18:03.106 "base_bdev_name": "malloc3" 00:18:03.107 } 00:18:03.107 } 00:18:03.107 }' 00:18:03.107 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.107 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.107 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:03.107 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.107 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.366 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == 
null ]] 00:18:03.366 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.366 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.366 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:03.366 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.366 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.366 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:03.366 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:03.366 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:03.626 [2024-07-12 22:25:13.913151] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:03.626 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8 '!=' a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8 ']' 00:18:03.626 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:18:03.626 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:03.626 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:03.626 22:25:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:03.885 [2024-07-12 22:25:14.165543] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:18:03.885 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:03.885 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:03.885 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:03.885 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:03.885 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:03.885 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:03.885 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.885 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.885 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.885 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.885 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.885 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:04.151 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:04.151 "name": "raid_bdev1", 00:18:04.151 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:18:04.151 "strip_size_kb": 0, 00:18:04.151 "state": "online", 00:18:04.151 "raid_level": "raid1", 00:18:04.151 "superblock": true, 
00:18:04.151 "num_base_bdevs": 3, 00:18:04.151 "num_base_bdevs_discovered": 2, 00:18:04.151 "num_base_bdevs_operational": 2, 00:18:04.151 "base_bdevs_list": [ 00:18:04.151 { 00:18:04.151 "name": null, 00:18:04.151 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:04.151 "is_configured": false, 00:18:04.151 "data_offset": 2048, 00:18:04.151 "data_size": 63488 00:18:04.151 }, 00:18:04.151 { 00:18:04.151 "name": "pt2", 00:18:04.151 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:04.151 "is_configured": true, 00:18:04.151 "data_offset": 2048, 00:18:04.151 "data_size": 63488 00:18:04.151 }, 00:18:04.151 { 00:18:04.151 "name": "pt3", 00:18:04.151 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:04.151 "is_configured": true, 00:18:04.151 "data_offset": 2048, 00:18:04.151 "data_size": 63488 00:18:04.151 } 00:18:04.151 ] 00:18:04.151 }' 00:18:04.151 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:04.151 22:25:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.718 22:25:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:04.976 [2024-07-12 22:25:15.128070] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:04.976 [2024-07-12 22:25:15.128095] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:04.976 [2024-07-12 22:25:15.128148] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:04.976 [2024-07-12 22:25:15.128201] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:04.976 [2024-07-12 22:25:15.128213] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xce9c00 name raid_bdev1, state offline 00:18:04.976 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:18:04.976 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.234 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:18:05.234 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:18:05.234 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:18:05.234 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:05.234 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:05.493 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:05.493 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:05.493 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:05.751 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:05.751 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:05.751 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:18:05.751 22:25:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:05.751 22:25:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:06.010 [2024-07-12 22:25:16.106615] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:06.010 [2024-07-12 22:25:16.106658] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:06.010 [2024-07-12 22:25:16.106675] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb47310 00:18:06.010 [2024-07-12 22:25:16.106688] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:06.010 [2024-07-12 22:25:16.108278] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:06.010 [2024-07-12 22:25:16.108310] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:06.010 [2024-07-12 22:25:16.108373] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:06.010 [2024-07-12 22:25:16.108399] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:06.010 pt2 00:18:06.010 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:18:06.010 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:06.010 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:06.010 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:06.010 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:06.010 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:06.010 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:06.010 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:06.010 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:06.010 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:06.010 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.010 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:06.268 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:06.268 "name": "raid_bdev1", 00:18:06.268 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:18:06.268 "strip_size_kb": 0, 00:18:06.268 "state": "configuring", 00:18:06.268 "raid_level": "raid1", 00:18:06.268 "superblock": true, 00:18:06.268 "num_base_bdevs": 3, 00:18:06.268 "num_base_bdevs_discovered": 1, 00:18:06.268 "num_base_bdevs_operational": 2, 00:18:06.268 "base_bdevs_list": [ 00:18:06.268 { 00:18:06.268 "name": null, 00:18:06.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:06.268 "is_configured": false, 00:18:06.268 "data_offset": 2048, 00:18:06.268 "data_size": 63488 00:18:06.268 }, 00:18:06.268 { 00:18:06.268 "name": "pt2", 00:18:06.268 "uuid": "00000000-0000-0000-0000-000000000002", 
00:18:06.268 "is_configured": true, 00:18:06.268 "data_offset": 2048, 00:18:06.268 "data_size": 63488 00:18:06.269 }, 00:18:06.269 { 00:18:06.269 "name": null, 00:18:06.269 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:06.269 "is_configured": false, 00:18:06.269 "data_offset": 2048, 00:18:06.269 "data_size": 63488 00:18:06.269 } 00:18:06.269 ] 00:18:06.269 }' 00:18:06.269 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:06.269 22:25:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:06.834 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:18:06.834 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:06.834 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:18:06.834 22:25:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:07.093 [2024-07-12 22:25:17.189484] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:07.093 [2024-07-12 22:25:17.189534] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:07.093 [2024-07-12 22:25:17.189555] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb45ec0 00:18:07.093 [2024-07-12 22:25:17.189568] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:07.093 [2024-07-12 22:25:17.189899] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:07.093 [2024-07-12 22:25:17.189916] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:07.093 [2024-07-12 22:25:17.189987] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:07.093 [2024-07-12 22:25:17.190007] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:07.093 [2024-07-12 22:25:17.190114] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xce7cc0 00:18:07.093 [2024-07-12 22:25:17.190124] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:07.093 [2024-07-12 22:25:17.190288] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xce86d0 00:18:07.093 [2024-07-12 22:25:17.190413] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xce7cc0 00:18:07.093 [2024-07-12 22:25:17.190423] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xce7cc0 00:18:07.093 [2024-07-12 22:25:17.190520] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:07.093 pt3 00:18:07.093 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:07.093 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:07.093 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:07.093 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:07.093 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:07.093 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:07.093 22:25:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:07.093 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:07.093 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:07.093 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:07.093 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.093 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:07.352 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:07.352 "name": "raid_bdev1", 00:18:07.352 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:18:07.352 "strip_size_kb": 0, 00:18:07.352 "state": "online", 00:18:07.352 "raid_level": "raid1", 00:18:07.352 "superblock": true, 00:18:07.352 "num_base_bdevs": 3, 00:18:07.352 "num_base_bdevs_discovered": 2, 00:18:07.352 "num_base_bdevs_operational": 2, 00:18:07.352 "base_bdevs_list": [ 00:18:07.352 { 00:18:07.352 "name": null, 00:18:07.352 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:07.352 "is_configured": false, 00:18:07.352 "data_offset": 2048, 00:18:07.352 "data_size": 63488 00:18:07.352 }, 00:18:07.352 { 00:18:07.352 "name": "pt2", 00:18:07.352 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:07.352 "is_configured": true, 00:18:07.352 "data_offset": 2048, 00:18:07.352 "data_size": 63488 00:18:07.352 }, 00:18:07.352 { 00:18:07.352 "name": "pt3", 00:18:07.352 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:07.352 "is_configured": true, 00:18:07.352 "data_offset": 2048, 00:18:07.352 "data_size": 63488 00:18:07.352 } 00:18:07.352 ] 00:18:07.352 }' 00:18:07.352 22:25:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:07.352 22:25:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:07.920 22:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:08.179 [2024-07-12 22:25:18.268345] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:08.179 [2024-07-12 22:25:18.268370] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:08.179 [2024-07-12 22:25:18.268425] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:08.179 [2024-07-12 22:25:18.268477] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:08.179 [2024-07-12 22:25:18.268489] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xce7cc0 name raid_bdev1, state offline 00:18:08.179 22:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.179 22:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:18:08.437 22:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:18:08.437 22:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:18:08.437 22:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 
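(Editor's sketch, not captured log output.) The verify_raid_bdev_state blocks traced through this test reduce to one RPC call filtered with jq: fetch every raid bdev over the test socket, select raid_bdev1, and compare its fields against the expected values. A minimal shell sketch, using only the socket path, bdev name, and JSON field names that appear verbatim in this log; the exact assertions inside the helper are not shown in the trace and are assumed here:
    tmp=$(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
          -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
          | jq -r '.[] | select(.name == "raid_bdev1")')              # same call and jq filter as the @126 trace lines
    [ "$(echo "$tmp" | jq -r '.state')" = "configuring" ]             # state compared against the expected value
    [ "$(echo "$tmp" | jq -r '.num_base_bdevs_discovered')" -eq 1 ]   # counts taken from the JSON dumps in this log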
00:18:08.437 22:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:18:08.437 22:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:08.696 22:25:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:08.696 [2024-07-12 22:25:19.006262] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:08.696 [2024-07-12 22:25:19.006305] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.696 [2024-07-12 22:25:19.006321] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb45ec0 00:18:08.696 [2024-07-12 22:25:19.006333] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.696 [2024-07-12 22:25:19.007911] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.696 [2024-07-12 22:25:19.007947] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:08.696 [2024-07-12 22:25:19.008010] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:08.696 [2024-07-12 22:25:19.008035] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:08.696 [2024-07-12 22:25:19.008129] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:18:08.696 [2024-07-12 22:25:19.008142] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:08.696 [2024-07-12 22:25:19.008157] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xce7f40 name raid_bdev1, state configuring 00:18:08.696 [2024-07-12 22:25:19.008180] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:08.696 pt1 00:18:08.955 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:18:08.955 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.956 "name": "raid_bdev1", 00:18:08.956 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:18:08.956 "strip_size_kb": 0, 00:18:08.956 "state": "configuring", 00:18:08.956 "raid_level": "raid1", 00:18:08.956 "superblock": true, 00:18:08.956 "num_base_bdevs": 3, 00:18:08.956 "num_base_bdevs_discovered": 1, 00:18:08.956 "num_base_bdevs_operational": 2, 00:18:08.956 "base_bdevs_list": [ 00:18:08.956 { 00:18:08.956 "name": null, 00:18:08.956 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.956 "is_configured": false, 00:18:08.956 "data_offset": 2048, 00:18:08.956 "data_size": 63488 00:18:08.956 }, 00:18:08.956 { 00:18:08.956 "name": "pt2", 00:18:08.956 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:08.956 "is_configured": true, 00:18:08.956 "data_offset": 2048, 00:18:08.956 "data_size": 63488 00:18:08.956 }, 00:18:08.956 { 00:18:08.956 "name": null, 00:18:08.956 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:08.956 "is_configured": false, 00:18:08.956 "data_offset": 2048, 00:18:08.956 "data_size": 63488 00:18:08.956 } 00:18:08.956 ] 00:18:08.956 }' 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.956 22:25:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:09.892 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:09.892 22:25:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:18:09.892 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:18:09.892 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:10.150 [2024-07-12 22:25:20.345829] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:10.150 [2024-07-12 22:25:20.345883] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:10.150 [2024-07-12 22:25:20.345902] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb490c0 00:18:10.150 [2024-07-12 22:25:20.345915] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:10.150 [2024-07-12 22:25:20.346273] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:10.150 [2024-07-12 22:25:20.346292] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:10.150 [2024-07-12 22:25:20.346357] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:10.150 [2024-07-12 22:25:20.346376] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:10.150 [2024-07-12 22:25:20.346477] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb49a40 00:18:10.150 [2024-07-12 22:25:20.346488] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:10.150 [2024-07-12 22:25:20.346652] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xce86c0 00:18:10.150 [2024-07-12 
22:25:20.346778] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb49a40 00:18:10.150 [2024-07-12 22:25:20.346788] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb49a40 00:18:10.150 [2024-07-12 22:25:20.346883] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:10.151 pt3 00:18:10.151 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:10.151 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:10.151 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:10.151 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:10.151 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:10.151 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:10.151 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.151 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.151 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.151 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.151 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.151 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:10.409 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.409 "name": "raid_bdev1", 00:18:10.409 "uuid": "a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8", 00:18:10.409 "strip_size_kb": 0, 00:18:10.409 "state": "online", 00:18:10.409 "raid_level": "raid1", 00:18:10.409 "superblock": true, 00:18:10.409 "num_base_bdevs": 3, 00:18:10.409 "num_base_bdevs_discovered": 2, 00:18:10.409 "num_base_bdevs_operational": 2, 00:18:10.409 "base_bdevs_list": [ 00:18:10.409 { 00:18:10.409 "name": null, 00:18:10.409 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.409 "is_configured": false, 00:18:10.409 "data_offset": 2048, 00:18:10.409 "data_size": 63488 00:18:10.409 }, 00:18:10.409 { 00:18:10.409 "name": "pt2", 00:18:10.409 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:10.409 "is_configured": true, 00:18:10.409 "data_offset": 2048, 00:18:10.409 "data_size": 63488 00:18:10.409 }, 00:18:10.409 { 00:18:10.409 "name": "pt3", 00:18:10.409 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:10.409 "is_configured": true, 00:18:10.409 "data_offset": 2048, 00:18:10.409 "data_size": 63488 00:18:10.409 } 00:18:10.409 ] 00:18:10.409 }' 00:18:10.409 22:25:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.409 22:25:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:10.973 22:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:18:10.973 22:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:11.231 22:25:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:18:11.231 22:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:11.231 22:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:18:11.490 [2024-07-12 22:25:21.681628] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:11.490 22:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8 '!=' a2cf5096-19bf-4bf0-89f4-6c4caf6ce1b8 ']' 00:18:11.490 22:25:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3476411 00:18:11.490 22:25:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3476411 ']' 00:18:11.490 22:25:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3476411 00:18:11.490 22:25:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:11.490 22:25:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:11.490 22:25:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3476411 00:18:11.490 22:25:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:11.490 22:25:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:11.490 22:25:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3476411' 00:18:11.490 killing process with pid 3476411 00:18:11.490 22:25:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3476411 00:18:11.490 [2024-07-12 22:25:21.753037] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:11.490 [2024-07-12 22:25:21.753091] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:11.490 [2024-07-12 22:25:21.753147] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:11.490 [2024-07-12 22:25:21.753158] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb49a40 name raid_bdev1, state offline 00:18:11.490 22:25:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3476411 00:18:11.490 [2024-07-12 22:25:21.784578] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:11.748 22:25:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:11.748 00:18:11.748 real 0m22.575s 00:18:11.748 user 0m41.247s 00:18:11.748 sys 0m3.965s 00:18:11.748 22:25:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:11.748 22:25:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:11.748 ************************************ 00:18:11.748 END TEST raid_superblock_test 00:18:11.748 ************************************ 00:18:11.748 22:25:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:11.748 22:25:22 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:18:11.748 22:25:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:11.748 22:25:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:11.748 22:25:22 bdev_raid -- common/autotest_common.sh@10 -- 
# set +x 00:18:12.007 ************************************ 00:18:12.007 START TEST raid_read_error_test 00:18:12.007 ************************************ 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.SsAnVhPfPZ 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3479846 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3479846 /var/tmp/spdk-raid.sock 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3479846 ']' 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 
-- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:12.007 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:12.007 22:25:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:12.007 [2024-07-12 22:25:22.168843] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:18:12.007 [2024-07-12 22:25:22.168916] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3479846 ] 00:18:12.007 [2024-07-12 22:25:22.296942] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:12.273 [2024-07-12 22:25:22.399896] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:12.273 [2024-07-12 22:25:22.461020] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:12.273 [2024-07-12 22:25:22.461056] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:12.846 22:25:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:12.846 22:25:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:12.846 22:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:12.846 22:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:13.104 BaseBdev1_malloc 00:18:13.104 22:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:13.363 true 00:18:13.363 22:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:13.622 [2024-07-12 22:25:23.826218] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:13.622 [2024-07-12 22:25:23.826265] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:13.622 [2024-07-12 22:25:23.826288] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25960d0 00:18:13.622 [2024-07-12 22:25:23.826302] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:13.622 [2024-07-12 22:25:23.828209] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:13.622 [2024-07-12 22:25:23.828238] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:13.622 BaseBdev1 00:18:13.622 22:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:13.622 22:25:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:13.880 BaseBdev2_malloc 00:18:13.880 22:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:14.140 true 00:18:14.140 22:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:14.399 [2024-07-12 22:25:24.564818] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:14.399 [2024-07-12 22:25:24.564861] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:14.399 [2024-07-12 22:25:24.564882] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x259a910 00:18:14.399 [2024-07-12 22:25:24.564895] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:14.399 [2024-07-12 22:25:24.566475] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:14.400 [2024-07-12 22:25:24.566502] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:14.400 BaseBdev2 00:18:14.400 22:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:14.400 22:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:14.658 BaseBdev3_malloc 00:18:14.658 22:25:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:14.917 true 00:18:14.917 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:15.177 [2024-07-12 22:25:25.300596] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:15.177 [2024-07-12 22:25:25.300643] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:15.177 [2024-07-12 22:25:25.300665] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x259cbd0 00:18:15.177 [2024-07-12 22:25:25.300678] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:15.177 [2024-07-12 22:25:25.302286] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:15.177 [2024-07-12 22:25:25.302314] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:15.177 BaseBdev3 00:18:15.177 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:15.437 [2024-07-12 22:25:25.533232] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:15.437 [2024-07-12 22:25:25.534613] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:15.437 [2024-07-12 22:25:25.534683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev3 is claimed 00:18:15.437 [2024-07-12 22:25:25.534906] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x259e280 00:18:15.437 [2024-07-12 22:25:25.534918] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:15.437 [2024-07-12 22:25:25.535127] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x259de20 00:18:15.437 [2024-07-12 22:25:25.535285] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x259e280 00:18:15.437 [2024-07-12 22:25:25.535296] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x259e280 00:18:15.437 [2024-07-12 22:25:25.535405] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:15.437 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:15.437 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:15.437 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:15.437 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:15.437 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:15.437 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:15.437 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:15.437 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:15.437 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:15.437 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:15.437 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.437 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:15.697 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.697 "name": "raid_bdev1", 00:18:15.697 "uuid": "14702098-bee5-4daf-9ff3-60de9c66255e", 00:18:15.697 "strip_size_kb": 0, 00:18:15.697 "state": "online", 00:18:15.697 "raid_level": "raid1", 00:18:15.697 "superblock": true, 00:18:15.697 "num_base_bdevs": 3, 00:18:15.697 "num_base_bdevs_discovered": 3, 00:18:15.697 "num_base_bdevs_operational": 3, 00:18:15.697 "base_bdevs_list": [ 00:18:15.697 { 00:18:15.697 "name": "BaseBdev1", 00:18:15.697 "uuid": "9e69b5f1-e8b2-54a8-ab19-09f487a8c321", 00:18:15.697 "is_configured": true, 00:18:15.697 "data_offset": 2048, 00:18:15.697 "data_size": 63488 00:18:15.697 }, 00:18:15.697 { 00:18:15.697 "name": "BaseBdev2", 00:18:15.697 "uuid": "13bee81c-3500-5712-9bcd-6379b303ffcc", 00:18:15.697 "is_configured": true, 00:18:15.697 "data_offset": 2048, 00:18:15.697 "data_size": 63488 00:18:15.697 }, 00:18:15.697 { 00:18:15.697 "name": "BaseBdev3", 00:18:15.697 "uuid": "9c87b7a2-452c-50f5-84fd-aef82af69d0a", 00:18:15.697 "is_configured": true, 00:18:15.697 "data_offset": 2048, 00:18:15.697 "data_size": 63488 00:18:15.697 } 00:18:15.697 ] 00:18:15.697 }' 00:18:15.697 22:25:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.697 22:25:25 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:16.265 22:25:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:16.265 22:25:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:16.265 [2024-07-12 22:25:26.463982] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ebe00 00:18:17.204 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.463 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:17.723 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:17.723 "name": "raid_bdev1", 00:18:17.723 "uuid": "14702098-bee5-4daf-9ff3-60de9c66255e", 00:18:17.723 "strip_size_kb": 0, 00:18:17.723 "state": "online", 00:18:17.723 "raid_level": "raid1", 00:18:17.723 "superblock": true, 00:18:17.723 "num_base_bdevs": 3, 00:18:17.723 "num_base_bdevs_discovered": 3, 00:18:17.723 "num_base_bdevs_operational": 3, 00:18:17.723 "base_bdevs_list": [ 00:18:17.723 { 00:18:17.723 "name": "BaseBdev1", 00:18:17.723 "uuid": "9e69b5f1-e8b2-54a8-ab19-09f487a8c321", 00:18:17.723 "is_configured": true, 00:18:17.723 "data_offset": 2048, 00:18:17.723 "data_size": 63488 00:18:17.723 }, 00:18:17.723 { 00:18:17.723 "name": "BaseBdev2", 00:18:17.723 "uuid": "13bee81c-3500-5712-9bcd-6379b303ffcc", 00:18:17.723 "is_configured": true, 00:18:17.723 "data_offset": 2048, 00:18:17.723 "data_size": 63488 00:18:17.723 }, 00:18:17.723 { 
00:18:17.723 "name": "BaseBdev3", 00:18:17.723 "uuid": "9c87b7a2-452c-50f5-84fd-aef82af69d0a", 00:18:17.723 "is_configured": true, 00:18:17.723 "data_offset": 2048, 00:18:17.723 "data_size": 63488 00:18:17.723 } 00:18:17.723 ] 00:18:17.723 }' 00:18:17.723 22:25:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:17.723 22:25:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:18.291 22:25:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:18.550 [2024-07-12 22:25:28.733855] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:18.551 [2024-07-12 22:25:28.733905] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:18.551 [2024-07-12 22:25:28.737050] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:18.551 [2024-07-12 22:25:28.737084] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:18.551 [2024-07-12 22:25:28.737184] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:18.551 [2024-07-12 22:25:28.737196] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x259e280 name raid_bdev1, state offline 00:18:18.551 0 00:18:18.551 22:25:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3479846 00:18:18.551 22:25:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3479846 ']' 00:18:18.551 22:25:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3479846 00:18:18.551 22:25:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:18:18.551 22:25:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:18.551 22:25:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3479846 00:18:18.551 22:25:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:18.551 22:25:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:18.551 22:25:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3479846' 00:18:18.551 killing process with pid 3479846 00:18:18.551 22:25:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3479846 00:18:18.551 [2024-07-12 22:25:28.803283] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:18.551 22:25:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3479846 00:18:18.551 [2024-07-12 22:25:28.824868] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:18.809 22:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.SsAnVhPfPZ 00:18:18.809 22:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:18.809 22:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:18.809 22:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:18.809 22:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:18.810 22:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:18.810 22:25:29 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:18.810 22:25:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:18.810 00:18:18.810 real 0m6.970s 00:18:18.810 user 0m11.043s 00:18:18.810 sys 0m1.188s 00:18:18.810 22:25:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:18.810 22:25:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:18.810 ************************************ 00:18:18.810 END TEST raid_read_error_test 00:18:18.810 ************************************ 00:18:18.810 22:25:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:18.810 22:25:29 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:18:18.810 22:25:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:18.810 22:25:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:18.810 22:25:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:19.069 ************************************ 00:18:19.069 START TEST raid_write_error_test 00:18:19.069 ************************************ 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:19.069 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:19.070 
22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.M5M1hZgHLb 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3480826 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3480826 /var/tmp/spdk-raid.sock 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3480826 ']' 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:19.070 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:19.070 22:25:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:19.070 [2024-07-12 22:25:29.229559] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
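(Editor's sketch, not captured log output.) The setup that follows mirrors the read-error test above: for each base bdev the harness stacks a malloc bdev, an error-injection bdev, and a passthru bdev, then assembles all three into a raid1 bdev with a superblock. Condensed from the RPC commands traced in this log (BaseBdev1 shown; BaseBdev2 and BaseBdev3 repeat the same three calls); the $RPC shorthand is illustrative only, the log invokes rpc.py by its full path each time:
    RPC='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'
    $RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc            # 32 MB backing bdev with 512-byte blocks
    $RPC bdev_error_create BaseBdev1_malloc                       # wraps it as EE_BaseBdev1_malloc for fault injection
    $RPC bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s   # -s: superblock, as in the JSON dumps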
00:18:19.070 [2024-07-12 22:25:29.229629] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3480826 ] 00:18:19.070 [2024-07-12 22:25:29.357019] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:19.329 [2024-07-12 22:25:29.459713] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:19.329 [2024-07-12 22:25:29.523392] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:19.329 [2024-07-12 22:25:29.523454] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:19.896 22:25:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:19.896 22:25:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:19.896 22:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:19.896 22:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:20.155 BaseBdev1_malloc 00:18:20.155 22:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:20.470 true 00:18:20.470 22:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:20.470 [2024-07-12 22:25:30.764238] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:20.470 [2024-07-12 22:25:30.764285] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:20.470 [2024-07-12 22:25:30.764305] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e4e0d0 00:18:20.470 [2024-07-12 22:25:30.764318] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:20.470 [2024-07-12 22:25:30.766082] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:20.470 [2024-07-12 22:25:30.766111] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:20.470 BaseBdev1 00:18:20.754 22:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:20.754 22:25:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:20.754 BaseBdev2_malloc 00:18:20.754 22:25:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:21.013 true 00:18:21.013 22:25:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:21.272 [2024-07-12 22:25:31.502737] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:21.272 [2024-07-12 22:25:31.502783] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:21.272 [2024-07-12 22:25:31.502803] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e52910 00:18:21.272 [2024-07-12 22:25:31.502816] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:21.272 [2024-07-12 22:25:31.504293] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:21.272 [2024-07-12 22:25:31.504323] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:21.272 BaseBdev2 00:18:21.272 22:25:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:21.272 22:25:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:21.532 BaseBdev3_malloc 00:18:21.532 22:25:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:21.791 true 00:18:21.791 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:22.050 [2024-07-12 22:25:32.241383] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:22.050 [2024-07-12 22:25:32.241429] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:22.050 [2024-07-12 22:25:32.241448] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e54bd0 00:18:22.050 [2024-07-12 22:25:32.241461] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:22.050 [2024-07-12 22:25:32.242862] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:22.050 [2024-07-12 22:25:32.242890] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:22.050 BaseBdev3 00:18:22.050 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:22.309 [2024-07-12 22:25:32.482052] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:22.309 [2024-07-12 22:25:32.483319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:22.309 [2024-07-12 22:25:32.483387] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:22.309 [2024-07-12 22:25:32.483596] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e56280 00:18:22.309 [2024-07-12 22:25:32.483609] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:22.309 [2024-07-12 22:25:32.483799] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e55e20 00:18:22.309 [2024-07-12 22:25:32.483960] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e56280 00:18:22.309 [2024-07-12 22:25:32.483971] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e56280 00:18:22.309 [2024-07-12 22:25:32.484070] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:22.309 
22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:22.309 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:22.309 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:22.309 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:22.309 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:22.309 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:22.309 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:22.309 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:22.309 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:22.309 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:22.309 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.309 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:22.569 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.569 "name": "raid_bdev1", 00:18:22.569 "uuid": "1a5f30cc-ac42-4460-b19e-e86fd162da48", 00:18:22.569 "strip_size_kb": 0, 00:18:22.569 "state": "online", 00:18:22.569 "raid_level": "raid1", 00:18:22.569 "superblock": true, 00:18:22.569 "num_base_bdevs": 3, 00:18:22.569 "num_base_bdevs_discovered": 3, 00:18:22.569 "num_base_bdevs_operational": 3, 00:18:22.569 "base_bdevs_list": [ 00:18:22.569 { 00:18:22.569 "name": "BaseBdev1", 00:18:22.569 "uuid": "b3ae0ef3-868c-5b98-a421-c0213f9c5d0a", 00:18:22.569 "is_configured": true, 00:18:22.569 "data_offset": 2048, 00:18:22.569 "data_size": 63488 00:18:22.569 }, 00:18:22.569 { 00:18:22.569 "name": "BaseBdev2", 00:18:22.569 "uuid": "15738315-0562-5451-9eb9-a2821995b807", 00:18:22.569 "is_configured": true, 00:18:22.569 "data_offset": 2048, 00:18:22.569 "data_size": 63488 00:18:22.569 }, 00:18:22.569 { 00:18:22.569 "name": "BaseBdev3", 00:18:22.569 "uuid": "8f4d1549-647a-598c-8bfc-45ce0d02c2b4", 00:18:22.569 "is_configured": true, 00:18:22.569 "data_offset": 2048, 00:18:22.569 "data_size": 63488 00:18:22.569 } 00:18:22.569 ] 00:18:22.569 }' 00:18:22.569 22:25:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.569 22:25:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:23.136 22:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:23.136 22:25:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:23.395 [2024-07-12 22:25:33.468968] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ca3e00 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:24.334 [2024-07-12 22:25:34.589690] 
bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:18:24.334 [2024-07-12 22:25:34.589750] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:24.334 [2024-07-12 22:25:34.589958] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1ca3e00 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.334 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:24.593 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:24.593 "name": "raid_bdev1", 00:18:24.593 "uuid": "1a5f30cc-ac42-4460-b19e-e86fd162da48", 00:18:24.593 "strip_size_kb": 0, 00:18:24.593 "state": "online", 00:18:24.593 "raid_level": "raid1", 00:18:24.593 "superblock": true, 00:18:24.593 "num_base_bdevs": 3, 00:18:24.593 "num_base_bdevs_discovered": 2, 00:18:24.593 "num_base_bdevs_operational": 2, 00:18:24.593 "base_bdevs_list": [ 00:18:24.593 { 00:18:24.593 "name": null, 00:18:24.593 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.593 "is_configured": false, 00:18:24.593 "data_offset": 2048, 00:18:24.593 "data_size": 63488 00:18:24.593 }, 00:18:24.593 { 00:18:24.593 "name": "BaseBdev2", 00:18:24.593 "uuid": "15738315-0562-5451-9eb9-a2821995b807", 00:18:24.593 "is_configured": true, 00:18:24.593 "data_offset": 2048, 00:18:24.593 "data_size": 63488 00:18:24.593 }, 00:18:24.593 { 00:18:24.593 "name": "BaseBdev3", 00:18:24.593 "uuid": "8f4d1549-647a-598c-8bfc-45ce0d02c2b4", 00:18:24.593 "is_configured": true, 00:18:24.593 "data_offset": 2048, 00:18:24.593 "data_size": 63488 00:18:24.593 } 00:18:24.593 ] 00:18:24.593 }' 00:18:24.593 22:25:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:24.593 
22:25:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.530 22:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:25.530 [2024-07-12 22:25:35.734013] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:25.530 [2024-07-12 22:25:35.734058] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:25.530 [2024-07-12 22:25:35.737227] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:25.530 [2024-07-12 22:25:35.737257] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:25.530 [2024-07-12 22:25:35.737333] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:25.530 [2024-07-12 22:25:35.737351] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e56280 name raid_bdev1, state offline 00:18:25.530 0 00:18:25.530 22:25:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3480826 00:18:25.530 22:25:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3480826 ']' 00:18:25.530 22:25:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3480826 00:18:25.530 22:25:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:18:25.530 22:25:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:25.530 22:25:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3480826 00:18:25.530 22:25:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:25.530 22:25:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:25.530 22:25:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3480826' 00:18:25.530 killing process with pid 3480826 00:18:25.530 22:25:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3480826 00:18:25.530 [2024-07-12 22:25:35.800042] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:25.530 22:25:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3480826 00:18:25.530 [2024-07-12 22:25:35.821430] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:25.789 22:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.M5M1hZgHLb 00:18:25.789 22:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:25.789 22:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:25.789 22:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:25.789 22:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:25.789 22:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:25.789 22:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:25.789 22:25:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:25.789 00:18:25.789 real 0m6.912s 00:18:25.789 user 0m10.875s 00:18:25.789 sys 0m1.278s 00:18:25.789 22:25:36 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:18:25.789 22:25:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.789 ************************************ 00:18:25.789 END TEST raid_write_error_test 00:18:25.789 ************************************ 00:18:25.789 22:25:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:25.789 22:25:36 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:18:25.789 22:25:36 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:18:25.789 22:25:36 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:18:25.789 22:25:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:25.789 22:25:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:25.789 22:25:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:26.049 ************************************ 00:18:26.049 START TEST raid_state_function_test 00:18:26.049 ************************************ 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3481809 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3481809' 00:18:26.049 Process raid pid: 3481809 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3481809 /var/tmp/spdk-raid.sock 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3481809 ']' 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:26.049 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:26.049 22:25:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:26.049 [2024-07-12 22:25:36.223296] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:18:26.049 [2024-07-12 22:25:36.223366] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:26.049 [2024-07-12 22:25:36.353969] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:26.308 [2024-07-12 22:25:36.455755] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:26.308 [2024-07-12 22:25:36.522532] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:26.308 [2024-07-12 22:25:36.522571] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:26.876 22:25:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:26.876 22:25:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:18:26.876 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:27.135 [2024-07-12 22:25:37.386267] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:27.135 [2024-07-12 22:25:37.386312] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:27.135 [2024-07-12 22:25:37.386323] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:27.135 [2024-07-12 22:25:37.386335] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:27.135 [2024-07-12 22:25:37.386344] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:27.135 [2024-07-12 22:25:37.386355] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:27.135 [2024-07-12 22:25:37.386364] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:27.135 [2024-07-12 22:25:37.386375] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:27.135 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:27.135 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:27.135 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:27.135 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:27.135 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:27.135 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:27.135 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.135 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.135 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.135 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.135 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.135 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:27.395 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.395 "name": "Existed_Raid", 00:18:27.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.395 "strip_size_kb": 64, 00:18:27.395 "state": "configuring", 00:18:27.395 "raid_level": "raid0", 00:18:27.395 "superblock": false, 00:18:27.395 "num_base_bdevs": 4, 00:18:27.395 "num_base_bdevs_discovered": 0, 00:18:27.395 "num_base_bdevs_operational": 4, 00:18:27.395 "base_bdevs_list": [ 00:18:27.395 { 00:18:27.395 "name": "BaseBdev1", 00:18:27.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.395 "is_configured": false, 00:18:27.395 "data_offset": 0, 00:18:27.395 "data_size": 0 00:18:27.395 }, 00:18:27.395 { 00:18:27.395 "name": "BaseBdev2", 00:18:27.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.395 "is_configured": false, 00:18:27.395 "data_offset": 0, 00:18:27.395 "data_size": 0 00:18:27.395 }, 00:18:27.395 { 00:18:27.395 "name": "BaseBdev3", 00:18:27.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.395 "is_configured": false, 00:18:27.395 "data_offset": 0, 00:18:27.395 "data_size": 0 00:18:27.395 }, 00:18:27.395 { 00:18:27.395 "name": "BaseBdev4", 00:18:27.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.395 "is_configured": false, 00:18:27.395 "data_offset": 0, 00:18:27.395 "data_size": 0 00:18:27.395 } 00:18:27.395 ] 00:18:27.395 }' 00:18:27.395 22:25:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.395 22:25:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:27.962 22:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:28.221 [2024-07-12 22:25:38.320602] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:28.221 [2024-07-12 22:25:38.320635] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21afaa0 name Existed_Raid, state configuring 00:18:28.221 22:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:28.480 [2024-07-12 22:25:38.561267] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:28.480 [2024-07-12 22:25:38.561302] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:28.480 [2024-07-12 22:25:38.561312] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:28.480 [2024-07-12 22:25:38.561323] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:28.480 [2024-07-12 22:25:38.561332] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:28.480 [2024-07-12 22:25:38.561343] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:28.480 [2024-07-12 22:25:38.561351] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:28.480 
[2024-07-12 22:25:38.561362] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:28.480 22:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:28.739 [2024-07-12 22:25:38.813047] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:28.739 BaseBdev1 00:18:28.739 22:25:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:28.739 22:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:28.739 22:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:28.739 22:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:28.739 22:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:28.739 22:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:28.739 22:25:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:28.998 [ 00:18:28.998 { 00:18:28.998 "name": "BaseBdev1", 00:18:28.998 "aliases": [ 00:18:28.998 "4df05016-88a9-46d0-b57e-21e8f8bb0586" 00:18:28.998 ], 00:18:28.998 "product_name": "Malloc disk", 00:18:28.998 "block_size": 512, 00:18:28.998 "num_blocks": 65536, 00:18:28.998 "uuid": "4df05016-88a9-46d0-b57e-21e8f8bb0586", 00:18:28.998 "assigned_rate_limits": { 00:18:28.998 "rw_ios_per_sec": 0, 00:18:28.998 "rw_mbytes_per_sec": 0, 00:18:28.998 "r_mbytes_per_sec": 0, 00:18:28.998 "w_mbytes_per_sec": 0 00:18:28.998 }, 00:18:28.998 "claimed": true, 00:18:28.998 "claim_type": "exclusive_write", 00:18:28.998 "zoned": false, 00:18:28.998 "supported_io_types": { 00:18:28.998 "read": true, 00:18:28.998 "write": true, 00:18:28.998 "unmap": true, 00:18:28.998 "flush": true, 00:18:28.998 "reset": true, 00:18:28.998 "nvme_admin": false, 00:18:28.998 "nvme_io": false, 00:18:28.998 "nvme_io_md": false, 00:18:28.998 "write_zeroes": true, 00:18:28.998 "zcopy": true, 00:18:28.998 "get_zone_info": false, 00:18:28.998 "zone_management": false, 00:18:28.998 "zone_append": false, 00:18:28.998 "compare": false, 00:18:28.998 "compare_and_write": false, 00:18:28.998 "abort": true, 00:18:28.998 "seek_hole": false, 00:18:28.998 "seek_data": false, 00:18:28.998 "copy": true, 00:18:28.998 "nvme_iov_md": false 00:18:28.998 }, 00:18:28.998 "memory_domains": [ 00:18:28.998 { 00:18:28.998 "dma_device_id": "system", 00:18:28.998 "dma_device_type": 1 00:18:28.998 }, 00:18:28.998 { 00:18:28.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.998 "dma_device_type": 2 00:18:28.998 } 00:18:28.998 ], 00:18:28.998 "driver_specific": {} 00:18:28.998 } 00:18:28.998 ] 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.998 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:29.257 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.257 "name": "Existed_Raid", 00:18:29.257 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.257 "strip_size_kb": 64, 00:18:29.257 "state": "configuring", 00:18:29.257 "raid_level": "raid0", 00:18:29.257 "superblock": false, 00:18:29.257 "num_base_bdevs": 4, 00:18:29.257 "num_base_bdevs_discovered": 1, 00:18:29.257 "num_base_bdevs_operational": 4, 00:18:29.257 "base_bdevs_list": [ 00:18:29.257 { 00:18:29.257 "name": "BaseBdev1", 00:18:29.257 "uuid": "4df05016-88a9-46d0-b57e-21e8f8bb0586", 00:18:29.257 "is_configured": true, 00:18:29.257 "data_offset": 0, 00:18:29.257 "data_size": 65536 00:18:29.257 }, 00:18:29.257 { 00:18:29.257 "name": "BaseBdev2", 00:18:29.257 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.257 "is_configured": false, 00:18:29.257 "data_offset": 0, 00:18:29.257 "data_size": 0 00:18:29.257 }, 00:18:29.257 { 00:18:29.257 "name": "BaseBdev3", 00:18:29.257 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.257 "is_configured": false, 00:18:29.257 "data_offset": 0, 00:18:29.257 "data_size": 0 00:18:29.257 }, 00:18:29.257 { 00:18:29.257 "name": "BaseBdev4", 00:18:29.257 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.257 "is_configured": false, 00:18:29.257 "data_offset": 0, 00:18:29.257 "data_size": 0 00:18:29.257 } 00:18:29.257 ] 00:18:29.257 }' 00:18:29.257 22:25:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.257 22:25:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:29.826 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:30.084 [2024-07-12 22:25:40.337097] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:30.084 [2024-07-12 22:25:40.337142] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21af310 name Existed_Raid, state configuring 00:18:30.084 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:30.343 [2024-07-12 22:25:40.509580] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:30.343 [2024-07-12 22:25:40.511026] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:30.343 [2024-07-12 22:25:40.511058] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:30.343 [2024-07-12 22:25:40.511069] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:30.343 [2024-07-12 22:25:40.511081] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:30.343 [2024-07-12 22:25:40.511091] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:30.343 [2024-07-12 22:25:40.511109] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.343 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:30.603 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.603 "name": "Existed_Raid", 00:18:30.603 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.603 "strip_size_kb": 64, 00:18:30.603 "state": "configuring", 00:18:30.603 "raid_level": "raid0", 00:18:30.603 "superblock": false, 00:18:30.603 "num_base_bdevs": 4, 00:18:30.603 "num_base_bdevs_discovered": 1, 00:18:30.603 "num_base_bdevs_operational": 4, 00:18:30.603 "base_bdevs_list": [ 00:18:30.603 { 00:18:30.603 "name": "BaseBdev1", 00:18:30.603 "uuid": "4df05016-88a9-46d0-b57e-21e8f8bb0586", 00:18:30.603 "is_configured": true, 00:18:30.603 "data_offset": 0, 00:18:30.603 "data_size": 65536 00:18:30.603 }, 00:18:30.603 { 00:18:30.603 "name": "BaseBdev2", 00:18:30.603 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:18:30.603 "is_configured": false, 00:18:30.603 "data_offset": 0, 00:18:30.603 "data_size": 0 00:18:30.603 }, 00:18:30.603 { 00:18:30.603 "name": "BaseBdev3", 00:18:30.603 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.603 "is_configured": false, 00:18:30.603 "data_offset": 0, 00:18:30.603 "data_size": 0 00:18:30.603 }, 00:18:30.603 { 00:18:30.603 "name": "BaseBdev4", 00:18:30.603 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.603 "is_configured": false, 00:18:30.603 "data_offset": 0, 00:18:30.603 "data_size": 0 00:18:30.603 } 00:18:30.603 ] 00:18:30.603 }' 00:18:30.603 22:25:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:30.603 22:25:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.170 22:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:31.429 [2024-07-12 22:25:41.624034] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:31.429 BaseBdev2 00:18:31.429 22:25:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:31.429 22:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:31.429 22:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:31.429 22:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:31.429 22:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:31.429 22:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:31.429 22:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:31.687 22:25:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:31.946 [ 00:18:31.946 { 00:18:31.946 "name": "BaseBdev2", 00:18:31.946 "aliases": [ 00:18:31.946 "11e54683-b152-48b2-8cce-1a2a6b32b711" 00:18:31.946 ], 00:18:31.946 "product_name": "Malloc disk", 00:18:31.946 "block_size": 512, 00:18:31.946 "num_blocks": 65536, 00:18:31.946 "uuid": "11e54683-b152-48b2-8cce-1a2a6b32b711", 00:18:31.946 "assigned_rate_limits": { 00:18:31.946 "rw_ios_per_sec": 0, 00:18:31.946 "rw_mbytes_per_sec": 0, 00:18:31.946 "r_mbytes_per_sec": 0, 00:18:31.946 "w_mbytes_per_sec": 0 00:18:31.946 }, 00:18:31.946 "claimed": true, 00:18:31.946 "claim_type": "exclusive_write", 00:18:31.946 "zoned": false, 00:18:31.946 "supported_io_types": { 00:18:31.946 "read": true, 00:18:31.946 "write": true, 00:18:31.946 "unmap": true, 00:18:31.946 "flush": true, 00:18:31.946 "reset": true, 00:18:31.946 "nvme_admin": false, 00:18:31.946 "nvme_io": false, 00:18:31.946 "nvme_io_md": false, 00:18:31.946 "write_zeroes": true, 00:18:31.946 "zcopy": true, 00:18:31.946 "get_zone_info": false, 00:18:31.946 "zone_management": false, 00:18:31.947 "zone_append": false, 00:18:31.947 "compare": false, 00:18:31.947 "compare_and_write": false, 00:18:31.947 "abort": true, 00:18:31.947 "seek_hole": false, 00:18:31.947 "seek_data": false, 00:18:31.947 
"copy": true, 00:18:31.947 "nvme_iov_md": false 00:18:31.947 }, 00:18:31.947 "memory_domains": [ 00:18:31.947 { 00:18:31.947 "dma_device_id": "system", 00:18:31.947 "dma_device_type": 1 00:18:31.947 }, 00:18:31.947 { 00:18:31.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.947 "dma_device_type": 2 00:18:31.947 } 00:18:31.947 ], 00:18:31.947 "driver_specific": {} 00:18:31.947 } 00:18:31.947 ] 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.947 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:32.206 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:32.206 "name": "Existed_Raid", 00:18:32.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:32.206 "strip_size_kb": 64, 00:18:32.206 "state": "configuring", 00:18:32.206 "raid_level": "raid0", 00:18:32.206 "superblock": false, 00:18:32.206 "num_base_bdevs": 4, 00:18:32.206 "num_base_bdevs_discovered": 2, 00:18:32.206 "num_base_bdevs_operational": 4, 00:18:32.206 "base_bdevs_list": [ 00:18:32.206 { 00:18:32.206 "name": "BaseBdev1", 00:18:32.206 "uuid": "4df05016-88a9-46d0-b57e-21e8f8bb0586", 00:18:32.206 "is_configured": true, 00:18:32.206 "data_offset": 0, 00:18:32.206 "data_size": 65536 00:18:32.206 }, 00:18:32.206 { 00:18:32.206 "name": "BaseBdev2", 00:18:32.206 "uuid": "11e54683-b152-48b2-8cce-1a2a6b32b711", 00:18:32.206 "is_configured": true, 00:18:32.206 "data_offset": 0, 00:18:32.206 "data_size": 65536 00:18:32.206 }, 00:18:32.206 { 00:18:32.206 "name": "BaseBdev3", 00:18:32.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:32.206 "is_configured": false, 00:18:32.206 "data_offset": 0, 00:18:32.206 "data_size": 0 00:18:32.206 }, 00:18:32.206 { 00:18:32.206 "name": "BaseBdev4", 00:18:32.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:32.206 "is_configured": false, 00:18:32.206 
"data_offset": 0, 00:18:32.206 "data_size": 0 00:18:32.206 } 00:18:32.206 ] 00:18:32.206 }' 00:18:32.206 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:32.206 22:25:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:32.773 22:25:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:33.032 [2024-07-12 22:25:43.127393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:33.032 BaseBdev3 00:18:33.032 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:33.032 22:25:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:33.032 22:25:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:33.032 22:25:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:33.032 22:25:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:33.032 22:25:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:33.032 22:25:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:33.032 22:25:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:33.291 [ 00:18:33.291 { 00:18:33.291 "name": "BaseBdev3", 00:18:33.291 "aliases": [ 00:18:33.291 "26813f22-9d40-4e1e-bf83-6ce1933bef6f" 00:18:33.291 ], 00:18:33.291 "product_name": "Malloc disk", 00:18:33.291 "block_size": 512, 00:18:33.291 "num_blocks": 65536, 00:18:33.291 "uuid": "26813f22-9d40-4e1e-bf83-6ce1933bef6f", 00:18:33.291 "assigned_rate_limits": { 00:18:33.291 "rw_ios_per_sec": 0, 00:18:33.291 "rw_mbytes_per_sec": 0, 00:18:33.291 "r_mbytes_per_sec": 0, 00:18:33.291 "w_mbytes_per_sec": 0 00:18:33.291 }, 00:18:33.291 "claimed": true, 00:18:33.291 "claim_type": "exclusive_write", 00:18:33.291 "zoned": false, 00:18:33.291 "supported_io_types": { 00:18:33.291 "read": true, 00:18:33.291 "write": true, 00:18:33.291 "unmap": true, 00:18:33.291 "flush": true, 00:18:33.291 "reset": true, 00:18:33.291 "nvme_admin": false, 00:18:33.291 "nvme_io": false, 00:18:33.291 "nvme_io_md": false, 00:18:33.291 "write_zeroes": true, 00:18:33.291 "zcopy": true, 00:18:33.291 "get_zone_info": false, 00:18:33.291 "zone_management": false, 00:18:33.291 "zone_append": false, 00:18:33.291 "compare": false, 00:18:33.291 "compare_and_write": false, 00:18:33.291 "abort": true, 00:18:33.291 "seek_hole": false, 00:18:33.291 "seek_data": false, 00:18:33.291 "copy": true, 00:18:33.291 "nvme_iov_md": false 00:18:33.291 }, 00:18:33.291 "memory_domains": [ 00:18:33.291 { 00:18:33.291 "dma_device_id": "system", 00:18:33.291 "dma_device_type": 1 00:18:33.291 }, 00:18:33.291 { 00:18:33.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.291 "dma_device_type": 2 00:18:33.291 } 00:18:33.291 ], 00:18:33.291 "driver_specific": {} 00:18:33.291 } 00:18:33.291 ] 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:33.291 22:25:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.291 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:33.550 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:33.550 "name": "Existed_Raid", 00:18:33.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:33.550 "strip_size_kb": 64, 00:18:33.550 "state": "configuring", 00:18:33.550 "raid_level": "raid0", 00:18:33.550 "superblock": false, 00:18:33.550 "num_base_bdevs": 4, 00:18:33.550 "num_base_bdevs_discovered": 3, 00:18:33.550 "num_base_bdevs_operational": 4, 00:18:33.550 "base_bdevs_list": [ 00:18:33.550 { 00:18:33.550 "name": "BaseBdev1", 00:18:33.550 "uuid": "4df05016-88a9-46d0-b57e-21e8f8bb0586", 00:18:33.550 "is_configured": true, 00:18:33.550 "data_offset": 0, 00:18:33.550 "data_size": 65536 00:18:33.550 }, 00:18:33.550 { 00:18:33.550 "name": "BaseBdev2", 00:18:33.550 "uuid": "11e54683-b152-48b2-8cce-1a2a6b32b711", 00:18:33.550 "is_configured": true, 00:18:33.550 "data_offset": 0, 00:18:33.550 "data_size": 65536 00:18:33.550 }, 00:18:33.550 { 00:18:33.550 "name": "BaseBdev3", 00:18:33.550 "uuid": "26813f22-9d40-4e1e-bf83-6ce1933bef6f", 00:18:33.550 "is_configured": true, 00:18:33.550 "data_offset": 0, 00:18:33.550 "data_size": 65536 00:18:33.550 }, 00:18:33.550 { 00:18:33.550 "name": "BaseBdev4", 00:18:33.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:33.550 "is_configured": false, 00:18:33.550 "data_offset": 0, 00:18:33.550 "data_size": 0 00:18:33.550 } 00:18:33.550 ] 00:18:33.550 }' 00:18:33.550 22:25:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:33.550 22:25:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:34.117 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:34.117 [2024-07-12 
22:25:44.366145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:34.117 [2024-07-12 22:25:44.366185] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21b0350 00:18:34.117 [2024-07-12 22:25:44.366194] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:34.117 [2024-07-12 22:25:44.366453] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21b0020 00:18:34.117 [2024-07-12 22:25:44.366576] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21b0350 00:18:34.117 [2024-07-12 22:25:44.366586] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21b0350 00:18:34.117 [2024-07-12 22:25:44.366750] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:34.117 BaseBdev4 00:18:34.117 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:34.117 22:25:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:34.117 22:25:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:34.117 22:25:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:34.117 22:25:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:34.117 22:25:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:34.117 22:25:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:34.375 22:25:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:34.634 [ 00:18:34.634 { 00:18:34.634 "name": "BaseBdev4", 00:18:34.634 "aliases": [ 00:18:34.634 "48498315-448a-4341-aed6-2c94db55201c" 00:18:34.634 ], 00:18:34.634 "product_name": "Malloc disk", 00:18:34.634 "block_size": 512, 00:18:34.634 "num_blocks": 65536, 00:18:34.634 "uuid": "48498315-448a-4341-aed6-2c94db55201c", 00:18:34.634 "assigned_rate_limits": { 00:18:34.634 "rw_ios_per_sec": 0, 00:18:34.634 "rw_mbytes_per_sec": 0, 00:18:34.634 "r_mbytes_per_sec": 0, 00:18:34.634 "w_mbytes_per_sec": 0 00:18:34.634 }, 00:18:34.634 "claimed": true, 00:18:34.634 "claim_type": "exclusive_write", 00:18:34.634 "zoned": false, 00:18:34.634 "supported_io_types": { 00:18:34.634 "read": true, 00:18:34.634 "write": true, 00:18:34.634 "unmap": true, 00:18:34.634 "flush": true, 00:18:34.634 "reset": true, 00:18:34.634 "nvme_admin": false, 00:18:34.634 "nvme_io": false, 00:18:34.634 "nvme_io_md": false, 00:18:34.634 "write_zeroes": true, 00:18:34.634 "zcopy": true, 00:18:34.634 "get_zone_info": false, 00:18:34.634 "zone_management": false, 00:18:34.634 "zone_append": false, 00:18:34.634 "compare": false, 00:18:34.634 "compare_and_write": false, 00:18:34.634 "abort": true, 00:18:34.634 "seek_hole": false, 00:18:34.634 "seek_data": false, 00:18:34.634 "copy": true, 00:18:34.634 "nvme_iov_md": false 00:18:34.634 }, 00:18:34.634 "memory_domains": [ 00:18:34.634 { 00:18:34.634 "dma_device_id": "system", 00:18:34.634 "dma_device_type": 1 00:18:34.634 }, 00:18:34.634 { 00:18:34.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.634 "dma_device_type": 2 
00:18:34.634 } 00:18:34.634 ], 00:18:34.634 "driver_specific": {} 00:18:34.634 } 00:18:34.634 ] 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:34.634 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:34.635 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.635 22:25:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:34.894 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:34.894 "name": "Existed_Raid", 00:18:34.894 "uuid": "fe6d57a8-b701-4a8e-baff-ef431132dc02", 00:18:34.894 "strip_size_kb": 64, 00:18:34.894 "state": "online", 00:18:34.894 "raid_level": "raid0", 00:18:34.894 "superblock": false, 00:18:34.894 "num_base_bdevs": 4, 00:18:34.894 "num_base_bdevs_discovered": 4, 00:18:34.894 "num_base_bdevs_operational": 4, 00:18:34.894 "base_bdevs_list": [ 00:18:34.894 { 00:18:34.894 "name": "BaseBdev1", 00:18:34.894 "uuid": "4df05016-88a9-46d0-b57e-21e8f8bb0586", 00:18:34.894 "is_configured": true, 00:18:34.894 "data_offset": 0, 00:18:34.894 "data_size": 65536 00:18:34.894 }, 00:18:34.894 { 00:18:34.894 "name": "BaseBdev2", 00:18:34.894 "uuid": "11e54683-b152-48b2-8cce-1a2a6b32b711", 00:18:34.894 "is_configured": true, 00:18:34.894 "data_offset": 0, 00:18:34.894 "data_size": 65536 00:18:34.894 }, 00:18:34.894 { 00:18:34.894 "name": "BaseBdev3", 00:18:34.894 "uuid": "26813f22-9d40-4e1e-bf83-6ce1933bef6f", 00:18:34.894 "is_configured": true, 00:18:34.894 "data_offset": 0, 00:18:34.894 "data_size": 65536 00:18:34.894 }, 00:18:34.894 { 00:18:34.894 "name": "BaseBdev4", 00:18:34.894 "uuid": "48498315-448a-4341-aed6-2c94db55201c", 00:18:34.894 "is_configured": true, 00:18:34.894 "data_offset": 0, 00:18:34.894 "data_size": 65536 00:18:34.894 } 00:18:34.894 ] 00:18:34.894 }' 00:18:34.894 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:34.894 22:25:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:35.461 22:25:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:35.461 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:35.461 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:35.461 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:35.461 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:35.461 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:35.461 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:35.461 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:35.720 [2024-07-12 22:25:45.810306] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:35.720 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:35.720 "name": "Existed_Raid", 00:18:35.720 "aliases": [ 00:18:35.720 "fe6d57a8-b701-4a8e-baff-ef431132dc02" 00:18:35.720 ], 00:18:35.720 "product_name": "Raid Volume", 00:18:35.720 "block_size": 512, 00:18:35.720 "num_blocks": 262144, 00:18:35.720 "uuid": "fe6d57a8-b701-4a8e-baff-ef431132dc02", 00:18:35.720 "assigned_rate_limits": { 00:18:35.720 "rw_ios_per_sec": 0, 00:18:35.720 "rw_mbytes_per_sec": 0, 00:18:35.720 "r_mbytes_per_sec": 0, 00:18:35.720 "w_mbytes_per_sec": 0 00:18:35.720 }, 00:18:35.720 "claimed": false, 00:18:35.720 "zoned": false, 00:18:35.720 "supported_io_types": { 00:18:35.720 "read": true, 00:18:35.720 "write": true, 00:18:35.720 "unmap": true, 00:18:35.720 "flush": true, 00:18:35.720 "reset": true, 00:18:35.720 "nvme_admin": false, 00:18:35.720 "nvme_io": false, 00:18:35.720 "nvme_io_md": false, 00:18:35.720 "write_zeroes": true, 00:18:35.720 "zcopy": false, 00:18:35.720 "get_zone_info": false, 00:18:35.720 "zone_management": false, 00:18:35.720 "zone_append": false, 00:18:35.720 "compare": false, 00:18:35.720 "compare_and_write": false, 00:18:35.720 "abort": false, 00:18:35.720 "seek_hole": false, 00:18:35.720 "seek_data": false, 00:18:35.720 "copy": false, 00:18:35.720 "nvme_iov_md": false 00:18:35.720 }, 00:18:35.720 "memory_domains": [ 00:18:35.720 { 00:18:35.720 "dma_device_id": "system", 00:18:35.720 "dma_device_type": 1 00:18:35.720 }, 00:18:35.720 { 00:18:35.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.720 "dma_device_type": 2 00:18:35.720 }, 00:18:35.720 { 00:18:35.720 "dma_device_id": "system", 00:18:35.720 "dma_device_type": 1 00:18:35.720 }, 00:18:35.720 { 00:18:35.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.720 "dma_device_type": 2 00:18:35.720 }, 00:18:35.720 { 00:18:35.720 "dma_device_id": "system", 00:18:35.720 "dma_device_type": 1 00:18:35.720 }, 00:18:35.720 { 00:18:35.721 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.721 "dma_device_type": 2 00:18:35.721 }, 00:18:35.721 { 00:18:35.721 "dma_device_id": "system", 00:18:35.721 "dma_device_type": 1 00:18:35.721 }, 00:18:35.721 { 00:18:35.721 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.721 "dma_device_type": 2 00:18:35.721 } 00:18:35.721 ], 00:18:35.721 "driver_specific": { 00:18:35.721 "raid": { 00:18:35.721 "uuid": "fe6d57a8-b701-4a8e-baff-ef431132dc02", 00:18:35.721 "strip_size_kb": 64, 00:18:35.721 
"state": "online", 00:18:35.721 "raid_level": "raid0", 00:18:35.721 "superblock": false, 00:18:35.721 "num_base_bdevs": 4, 00:18:35.721 "num_base_bdevs_discovered": 4, 00:18:35.721 "num_base_bdevs_operational": 4, 00:18:35.721 "base_bdevs_list": [ 00:18:35.721 { 00:18:35.721 "name": "BaseBdev1", 00:18:35.721 "uuid": "4df05016-88a9-46d0-b57e-21e8f8bb0586", 00:18:35.721 "is_configured": true, 00:18:35.721 "data_offset": 0, 00:18:35.721 "data_size": 65536 00:18:35.721 }, 00:18:35.721 { 00:18:35.721 "name": "BaseBdev2", 00:18:35.721 "uuid": "11e54683-b152-48b2-8cce-1a2a6b32b711", 00:18:35.721 "is_configured": true, 00:18:35.721 "data_offset": 0, 00:18:35.721 "data_size": 65536 00:18:35.721 }, 00:18:35.721 { 00:18:35.721 "name": "BaseBdev3", 00:18:35.721 "uuid": "26813f22-9d40-4e1e-bf83-6ce1933bef6f", 00:18:35.721 "is_configured": true, 00:18:35.721 "data_offset": 0, 00:18:35.721 "data_size": 65536 00:18:35.721 }, 00:18:35.721 { 00:18:35.721 "name": "BaseBdev4", 00:18:35.721 "uuid": "48498315-448a-4341-aed6-2c94db55201c", 00:18:35.721 "is_configured": true, 00:18:35.721 "data_offset": 0, 00:18:35.721 "data_size": 65536 00:18:35.721 } 00:18:35.721 ] 00:18:35.721 } 00:18:35.721 } 00:18:35.721 }' 00:18:35.721 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:35.721 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:35.721 BaseBdev2 00:18:35.721 BaseBdev3 00:18:35.721 BaseBdev4' 00:18:35.721 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:35.721 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:35.721 22:25:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:35.980 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:35.980 "name": "BaseBdev1", 00:18:35.980 "aliases": [ 00:18:35.980 "4df05016-88a9-46d0-b57e-21e8f8bb0586" 00:18:35.980 ], 00:18:35.980 "product_name": "Malloc disk", 00:18:35.980 "block_size": 512, 00:18:35.980 "num_blocks": 65536, 00:18:35.980 "uuid": "4df05016-88a9-46d0-b57e-21e8f8bb0586", 00:18:35.980 "assigned_rate_limits": { 00:18:35.980 "rw_ios_per_sec": 0, 00:18:35.980 "rw_mbytes_per_sec": 0, 00:18:35.980 "r_mbytes_per_sec": 0, 00:18:35.980 "w_mbytes_per_sec": 0 00:18:35.980 }, 00:18:35.980 "claimed": true, 00:18:35.980 "claim_type": "exclusive_write", 00:18:35.980 "zoned": false, 00:18:35.980 "supported_io_types": { 00:18:35.980 "read": true, 00:18:35.980 "write": true, 00:18:35.980 "unmap": true, 00:18:35.980 "flush": true, 00:18:35.980 "reset": true, 00:18:35.980 "nvme_admin": false, 00:18:35.980 "nvme_io": false, 00:18:35.980 "nvme_io_md": false, 00:18:35.980 "write_zeroes": true, 00:18:35.980 "zcopy": true, 00:18:35.980 "get_zone_info": false, 00:18:35.980 "zone_management": false, 00:18:35.980 "zone_append": false, 00:18:35.980 "compare": false, 00:18:35.980 "compare_and_write": false, 00:18:35.980 "abort": true, 00:18:35.980 "seek_hole": false, 00:18:35.980 "seek_data": false, 00:18:35.980 "copy": true, 00:18:35.980 "nvme_iov_md": false 00:18:35.980 }, 00:18:35.980 "memory_domains": [ 00:18:35.980 { 00:18:35.980 "dma_device_id": "system", 00:18:35.980 "dma_device_type": 1 00:18:35.980 }, 00:18:35.980 { 00:18:35.980 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.980 "dma_device_type": 2 00:18:35.980 } 00:18:35.980 ], 00:18:35.980 "driver_specific": {} 00:18:35.980 }' 00:18:35.980 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.980 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.980 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:35.980 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.980 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:36.239 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:36.239 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:36.239 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:36.239 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:36.239 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:36.239 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:36.239 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:36.239 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:36.239 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:36.239 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:36.808 22:25:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:36.808 "name": "BaseBdev2", 00:18:36.808 "aliases": [ 00:18:36.808 "11e54683-b152-48b2-8cce-1a2a6b32b711" 00:18:36.808 ], 00:18:36.808 "product_name": "Malloc disk", 00:18:36.808 "block_size": 512, 00:18:36.808 "num_blocks": 65536, 00:18:36.808 "uuid": "11e54683-b152-48b2-8cce-1a2a6b32b711", 00:18:36.808 "assigned_rate_limits": { 00:18:36.808 "rw_ios_per_sec": 0, 00:18:36.808 "rw_mbytes_per_sec": 0, 00:18:36.808 "r_mbytes_per_sec": 0, 00:18:36.808 "w_mbytes_per_sec": 0 00:18:36.808 }, 00:18:36.808 "claimed": true, 00:18:36.808 "claim_type": "exclusive_write", 00:18:36.808 "zoned": false, 00:18:36.808 "supported_io_types": { 00:18:36.808 "read": true, 00:18:36.808 "write": true, 00:18:36.808 "unmap": true, 00:18:36.808 "flush": true, 00:18:36.808 "reset": true, 00:18:36.808 "nvme_admin": false, 00:18:36.808 "nvme_io": false, 00:18:36.808 "nvme_io_md": false, 00:18:36.808 "write_zeroes": true, 00:18:36.808 "zcopy": true, 00:18:36.808 "get_zone_info": false, 00:18:36.808 "zone_management": false, 00:18:36.808 "zone_append": false, 00:18:36.808 "compare": false, 00:18:36.808 "compare_and_write": false, 00:18:36.808 "abort": true, 00:18:36.808 "seek_hole": false, 00:18:36.808 "seek_data": false, 00:18:36.808 "copy": true, 00:18:36.808 "nvme_iov_md": false 00:18:36.808 }, 00:18:36.808 "memory_domains": [ 00:18:36.808 { 00:18:36.808 "dma_device_id": "system", 00:18:36.808 "dma_device_type": 1 00:18:36.808 }, 00:18:36.808 { 00:18:36.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.808 "dma_device_type": 2 00:18:36.808 } 00:18:36.808 ], 00:18:36.808 "driver_specific": {} 00:18:36.808 }' 00:18:36.808 22:25:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:36.808 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:36.808 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:36.808 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:36.808 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.067 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:37.067 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.067 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.067 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:37.067 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:37.067 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:37.067 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:37.067 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:37.067 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:37.067 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:37.326 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:37.326 "name": "BaseBdev3", 00:18:37.326 "aliases": [ 00:18:37.326 "26813f22-9d40-4e1e-bf83-6ce1933bef6f" 00:18:37.326 ], 00:18:37.326 "product_name": "Malloc disk", 00:18:37.326 "block_size": 512, 00:18:37.326 "num_blocks": 65536, 00:18:37.326 "uuid": "26813f22-9d40-4e1e-bf83-6ce1933bef6f", 00:18:37.326 "assigned_rate_limits": { 00:18:37.326 "rw_ios_per_sec": 0, 00:18:37.326 "rw_mbytes_per_sec": 0, 00:18:37.326 "r_mbytes_per_sec": 0, 00:18:37.326 "w_mbytes_per_sec": 0 00:18:37.326 }, 00:18:37.326 "claimed": true, 00:18:37.326 "claim_type": "exclusive_write", 00:18:37.326 "zoned": false, 00:18:37.326 "supported_io_types": { 00:18:37.326 "read": true, 00:18:37.326 "write": true, 00:18:37.326 "unmap": true, 00:18:37.326 "flush": true, 00:18:37.326 "reset": true, 00:18:37.326 "nvme_admin": false, 00:18:37.326 "nvme_io": false, 00:18:37.326 "nvme_io_md": false, 00:18:37.326 "write_zeroes": true, 00:18:37.326 "zcopy": true, 00:18:37.326 "get_zone_info": false, 00:18:37.326 "zone_management": false, 00:18:37.326 "zone_append": false, 00:18:37.326 "compare": false, 00:18:37.326 "compare_and_write": false, 00:18:37.326 "abort": true, 00:18:37.326 "seek_hole": false, 00:18:37.326 "seek_data": false, 00:18:37.326 "copy": true, 00:18:37.326 "nvme_iov_md": false 00:18:37.326 }, 00:18:37.326 "memory_domains": [ 00:18:37.326 { 00:18:37.326 "dma_device_id": "system", 00:18:37.326 "dma_device_type": 1 00:18:37.326 }, 00:18:37.326 { 00:18:37.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.326 "dma_device_type": 2 00:18:37.326 } 00:18:37.326 ], 00:18:37.326 "driver_specific": {} 00:18:37.326 }' 00:18:37.326 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.585 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:18:37.585 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:37.585 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.585 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.585 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:37.585 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.585 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.844 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:37.844 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:37.844 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:37.844 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:37.844 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:37.844 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:37.844 22:25:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:38.103 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:38.103 "name": "BaseBdev4", 00:18:38.103 "aliases": [ 00:18:38.103 "48498315-448a-4341-aed6-2c94db55201c" 00:18:38.103 ], 00:18:38.103 "product_name": "Malloc disk", 00:18:38.103 "block_size": 512, 00:18:38.103 "num_blocks": 65536, 00:18:38.103 "uuid": "48498315-448a-4341-aed6-2c94db55201c", 00:18:38.103 "assigned_rate_limits": { 00:18:38.103 "rw_ios_per_sec": 0, 00:18:38.103 "rw_mbytes_per_sec": 0, 00:18:38.103 "r_mbytes_per_sec": 0, 00:18:38.103 "w_mbytes_per_sec": 0 00:18:38.103 }, 00:18:38.103 "claimed": true, 00:18:38.103 "claim_type": "exclusive_write", 00:18:38.103 "zoned": false, 00:18:38.103 "supported_io_types": { 00:18:38.103 "read": true, 00:18:38.103 "write": true, 00:18:38.103 "unmap": true, 00:18:38.103 "flush": true, 00:18:38.103 "reset": true, 00:18:38.103 "nvme_admin": false, 00:18:38.103 "nvme_io": false, 00:18:38.103 "nvme_io_md": false, 00:18:38.103 "write_zeroes": true, 00:18:38.103 "zcopy": true, 00:18:38.103 "get_zone_info": false, 00:18:38.103 "zone_management": false, 00:18:38.103 "zone_append": false, 00:18:38.103 "compare": false, 00:18:38.103 "compare_and_write": false, 00:18:38.103 "abort": true, 00:18:38.103 "seek_hole": false, 00:18:38.103 "seek_data": false, 00:18:38.103 "copy": true, 00:18:38.103 "nvme_iov_md": false 00:18:38.103 }, 00:18:38.103 "memory_domains": [ 00:18:38.103 { 00:18:38.103 "dma_device_id": "system", 00:18:38.103 "dma_device_type": 1 00:18:38.103 }, 00:18:38.103 { 00:18:38.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.103 "dma_device_type": 2 00:18:38.103 } 00:18:38.103 ], 00:18:38.103 "driver_specific": {} 00:18:38.103 }' 00:18:38.103 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.103 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.103 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:38.103 22:25:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.103 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.103 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:38.103 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.361 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.361 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:38.361 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.361 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.361 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:38.361 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:38.620 [2024-07-12 22:25:48.801954] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:38.620 [2024-07-12 22:25:48.801986] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:38.620 [2024-07-12 22:25:48.802037] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:38.620 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:38.621 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:38.621 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:38.621 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:38.621 22:25:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.880 22:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:38.880 
"name": "Existed_Raid", 00:18:38.880 "uuid": "fe6d57a8-b701-4a8e-baff-ef431132dc02", 00:18:38.880 "strip_size_kb": 64, 00:18:38.880 "state": "offline", 00:18:38.880 "raid_level": "raid0", 00:18:38.880 "superblock": false, 00:18:38.880 "num_base_bdevs": 4, 00:18:38.880 "num_base_bdevs_discovered": 3, 00:18:38.880 "num_base_bdevs_operational": 3, 00:18:38.880 "base_bdevs_list": [ 00:18:38.880 { 00:18:38.880 "name": null, 00:18:38.880 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:38.880 "is_configured": false, 00:18:38.880 "data_offset": 0, 00:18:38.880 "data_size": 65536 00:18:38.880 }, 00:18:38.880 { 00:18:38.880 "name": "BaseBdev2", 00:18:38.880 "uuid": "11e54683-b152-48b2-8cce-1a2a6b32b711", 00:18:38.880 "is_configured": true, 00:18:38.880 "data_offset": 0, 00:18:38.880 "data_size": 65536 00:18:38.880 }, 00:18:38.880 { 00:18:38.880 "name": "BaseBdev3", 00:18:38.880 "uuid": "26813f22-9d40-4e1e-bf83-6ce1933bef6f", 00:18:38.880 "is_configured": true, 00:18:38.880 "data_offset": 0, 00:18:38.880 "data_size": 65536 00:18:38.880 }, 00:18:38.880 { 00:18:38.880 "name": "BaseBdev4", 00:18:38.880 "uuid": "48498315-448a-4341-aed6-2c94db55201c", 00:18:38.880 "is_configured": true, 00:18:38.880 "data_offset": 0, 00:18:38.880 "data_size": 65536 00:18:38.880 } 00:18:38.880 ] 00:18:38.880 }' 00:18:38.880 22:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:38.880 22:25:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:39.568 22:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:39.568 22:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:39.568 22:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.568 22:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:39.826 22:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:39.826 22:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:39.826 22:25:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:40.394 [2024-07-12 22:25:50.415312] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:40.394 22:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:40.394 22:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:40.394 22:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.394 22:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:40.394 22:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:40.394 22:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:40.394 22:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:40.653 
[2024-07-12 22:25:50.937194] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:40.653 22:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:40.653 22:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:40.653 22:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.653 22:25:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:40.913 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:40.913 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:40.913 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:41.172 [2024-07-12 22:25:51.421157] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:41.172 [2024-07-12 22:25:51.421204] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21b0350 name Existed_Raid, state offline 00:18:41.172 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:41.172 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:41.172 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.172 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:41.431 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:41.431 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:41.431 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:41.431 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:41.431 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:41.431 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:41.690 BaseBdev2 00:18:41.690 22:25:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:41.690 22:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:41.690 22:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:41.690 22:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:41.690 22:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:41.690 22:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:41.690 22:25:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:41.690 22:25:52 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:41.949 [ 00:18:41.949 { 00:18:41.949 "name": "BaseBdev2", 00:18:41.949 "aliases": [ 00:18:41.949 "61c6f42f-6096-484a-875a-a72946105dd1" 00:18:41.949 ], 00:18:41.949 "product_name": "Malloc disk", 00:18:41.949 "block_size": 512, 00:18:41.949 "num_blocks": 65536, 00:18:41.949 "uuid": "61c6f42f-6096-484a-875a-a72946105dd1", 00:18:41.949 "assigned_rate_limits": { 00:18:41.950 "rw_ios_per_sec": 0, 00:18:41.950 "rw_mbytes_per_sec": 0, 00:18:41.950 "r_mbytes_per_sec": 0, 00:18:41.950 "w_mbytes_per_sec": 0 00:18:41.950 }, 00:18:41.950 "claimed": false, 00:18:41.950 "zoned": false, 00:18:41.950 "supported_io_types": { 00:18:41.950 "read": true, 00:18:41.950 "write": true, 00:18:41.950 "unmap": true, 00:18:41.950 "flush": true, 00:18:41.950 "reset": true, 00:18:41.950 "nvme_admin": false, 00:18:41.950 "nvme_io": false, 00:18:41.950 "nvme_io_md": false, 00:18:41.950 "write_zeroes": true, 00:18:41.950 "zcopy": true, 00:18:41.950 "get_zone_info": false, 00:18:41.950 "zone_management": false, 00:18:41.950 "zone_append": false, 00:18:41.950 "compare": false, 00:18:41.950 "compare_and_write": false, 00:18:41.950 "abort": true, 00:18:41.950 "seek_hole": false, 00:18:41.950 "seek_data": false, 00:18:41.950 "copy": true, 00:18:41.950 "nvme_iov_md": false 00:18:41.950 }, 00:18:41.950 "memory_domains": [ 00:18:41.950 { 00:18:41.950 "dma_device_id": "system", 00:18:41.950 "dma_device_type": 1 00:18:41.950 }, 00:18:41.950 { 00:18:41.950 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:41.950 "dma_device_type": 2 00:18:41.950 } 00:18:41.950 ], 00:18:41.950 "driver_specific": {} 00:18:41.950 } 00:18:41.950 ] 00:18:41.950 22:25:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:41.950 22:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:41.950 22:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:41.950 22:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:42.209 BaseBdev3 00:18:42.209 22:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:42.209 22:25:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:42.209 22:25:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:42.209 22:25:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:42.209 22:25:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:42.209 22:25:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:42.209 22:25:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:42.468 22:25:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:42.468 [ 00:18:42.468 { 00:18:42.468 "name": "BaseBdev3", 00:18:42.468 "aliases": [ 00:18:42.468 
"e81192a3-9aab-464c-b946-cafd4f95dfd3" 00:18:42.468 ], 00:18:42.468 "product_name": "Malloc disk", 00:18:42.468 "block_size": 512, 00:18:42.468 "num_blocks": 65536, 00:18:42.468 "uuid": "e81192a3-9aab-464c-b946-cafd4f95dfd3", 00:18:42.468 "assigned_rate_limits": { 00:18:42.468 "rw_ios_per_sec": 0, 00:18:42.468 "rw_mbytes_per_sec": 0, 00:18:42.468 "r_mbytes_per_sec": 0, 00:18:42.468 "w_mbytes_per_sec": 0 00:18:42.468 }, 00:18:42.468 "claimed": false, 00:18:42.468 "zoned": false, 00:18:42.468 "supported_io_types": { 00:18:42.468 "read": true, 00:18:42.468 "write": true, 00:18:42.468 "unmap": true, 00:18:42.468 "flush": true, 00:18:42.468 "reset": true, 00:18:42.468 "nvme_admin": false, 00:18:42.468 "nvme_io": false, 00:18:42.468 "nvme_io_md": false, 00:18:42.468 "write_zeroes": true, 00:18:42.468 "zcopy": true, 00:18:42.468 "get_zone_info": false, 00:18:42.468 "zone_management": false, 00:18:42.468 "zone_append": false, 00:18:42.468 "compare": false, 00:18:42.468 "compare_and_write": false, 00:18:42.468 "abort": true, 00:18:42.468 "seek_hole": false, 00:18:42.468 "seek_data": false, 00:18:42.468 "copy": true, 00:18:42.468 "nvme_iov_md": false 00:18:42.468 }, 00:18:42.468 "memory_domains": [ 00:18:42.468 { 00:18:42.468 "dma_device_id": "system", 00:18:42.468 "dma_device_type": 1 00:18:42.468 }, 00:18:42.468 { 00:18:42.468 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:42.468 "dma_device_type": 2 00:18:42.468 } 00:18:42.468 ], 00:18:42.468 "driver_specific": {} 00:18:42.468 } 00:18:42.468 ] 00:18:42.727 22:25:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:42.727 22:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:42.727 22:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:42.727 22:25:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:42.727 BaseBdev4 00:18:42.727 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:42.727 22:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:42.727 22:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:42.727 22:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:42.727 22:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:42.727 22:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:42.727 22:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:42.985 22:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:43.243 [ 00:18:43.243 { 00:18:43.243 "name": "BaseBdev4", 00:18:43.243 "aliases": [ 00:18:43.243 "dd29686d-9812-4aa5-9deb-96a3e104ac34" 00:18:43.243 ], 00:18:43.243 "product_name": "Malloc disk", 00:18:43.243 "block_size": 512, 00:18:43.243 "num_blocks": 65536, 00:18:43.243 "uuid": "dd29686d-9812-4aa5-9deb-96a3e104ac34", 00:18:43.243 "assigned_rate_limits": { 00:18:43.243 
"rw_ios_per_sec": 0, 00:18:43.243 "rw_mbytes_per_sec": 0, 00:18:43.243 "r_mbytes_per_sec": 0, 00:18:43.243 "w_mbytes_per_sec": 0 00:18:43.243 }, 00:18:43.243 "claimed": false, 00:18:43.243 "zoned": false, 00:18:43.243 "supported_io_types": { 00:18:43.243 "read": true, 00:18:43.243 "write": true, 00:18:43.243 "unmap": true, 00:18:43.243 "flush": true, 00:18:43.243 "reset": true, 00:18:43.243 "nvme_admin": false, 00:18:43.243 "nvme_io": false, 00:18:43.243 "nvme_io_md": false, 00:18:43.243 "write_zeroes": true, 00:18:43.243 "zcopy": true, 00:18:43.243 "get_zone_info": false, 00:18:43.243 "zone_management": false, 00:18:43.243 "zone_append": false, 00:18:43.243 "compare": false, 00:18:43.243 "compare_and_write": false, 00:18:43.243 "abort": true, 00:18:43.243 "seek_hole": false, 00:18:43.243 "seek_data": false, 00:18:43.243 "copy": true, 00:18:43.244 "nvme_iov_md": false 00:18:43.244 }, 00:18:43.244 "memory_domains": [ 00:18:43.244 { 00:18:43.244 "dma_device_id": "system", 00:18:43.244 "dma_device_type": 1 00:18:43.244 }, 00:18:43.244 { 00:18:43.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:43.244 "dma_device_type": 2 00:18:43.244 } 00:18:43.244 ], 00:18:43.244 "driver_specific": {} 00:18:43.244 } 00:18:43.244 ] 00:18:43.244 22:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:43.244 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:43.244 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:43.244 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:43.503 [2024-07-12 22:25:53.650184] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:43.503 [2024-07-12 22:25:53.650232] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:43.503 [2024-07-12 22:25:53.650251] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:43.503 [2024-07-12 22:25:53.651634] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:43.503 [2024-07-12 22:25:53.651678] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:43.503 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:43.503 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:43.503 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:43.503 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:43.503 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:43.503 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:43.503 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:43.503 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:43.503 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:43.503 22:25:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:43.503 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.503 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:43.761 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:43.761 "name": "Existed_Raid", 00:18:43.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.761 "strip_size_kb": 64, 00:18:43.761 "state": "configuring", 00:18:43.761 "raid_level": "raid0", 00:18:43.761 "superblock": false, 00:18:43.761 "num_base_bdevs": 4, 00:18:43.761 "num_base_bdevs_discovered": 3, 00:18:43.761 "num_base_bdevs_operational": 4, 00:18:43.761 "base_bdevs_list": [ 00:18:43.761 { 00:18:43.761 "name": "BaseBdev1", 00:18:43.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.761 "is_configured": false, 00:18:43.761 "data_offset": 0, 00:18:43.761 "data_size": 0 00:18:43.761 }, 00:18:43.761 { 00:18:43.761 "name": "BaseBdev2", 00:18:43.761 "uuid": "61c6f42f-6096-484a-875a-a72946105dd1", 00:18:43.761 "is_configured": true, 00:18:43.761 "data_offset": 0, 00:18:43.761 "data_size": 65536 00:18:43.761 }, 00:18:43.761 { 00:18:43.761 "name": "BaseBdev3", 00:18:43.761 "uuid": "e81192a3-9aab-464c-b946-cafd4f95dfd3", 00:18:43.761 "is_configured": true, 00:18:43.761 "data_offset": 0, 00:18:43.761 "data_size": 65536 00:18:43.761 }, 00:18:43.761 { 00:18:43.761 "name": "BaseBdev4", 00:18:43.761 "uuid": "dd29686d-9812-4aa5-9deb-96a3e104ac34", 00:18:43.761 "is_configured": true, 00:18:43.761 "data_offset": 0, 00:18:43.761 "data_size": 65536 00:18:43.761 } 00:18:43.761 ] 00:18:43.761 }' 00:18:43.761 22:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:43.761 22:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:44.325 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:44.583 [2024-07-12 22:25:54.700944] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.583 22:25:54 
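The rebuild sequence above shows that bdev_raid_create does not require every named base bdev to exist up front: BaseBdev2-4 are recreated as 32 MB malloc bdevs, the create call names a BaseBdev1 that "doesn't exist now", and Existed_Raid therefore sits in the configuring state with num_base_bdevs_discovered below num_base_bdevs until the missing member appears. A condensed sketch of that flow, reusing only commands already visible in this log:

# Sketch: creating a raid0 bdev before all of its members exist.
rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

for name in BaseBdev2 BaseBdev3 BaseBdev4; do
    rpc bdev_malloc_create 32 512 -b "$name"    # 32 MB malloc bdev, 512-byte blocks
done

# BaseBdev1 is deliberately still missing at this point
rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

# the array exists but is not usable yet: it reports "configuring", not "online"
rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'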
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.583 "name": "Existed_Raid", 00:18:44.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.583 "strip_size_kb": 64, 00:18:44.583 "state": "configuring", 00:18:44.583 "raid_level": "raid0", 00:18:44.583 "superblock": false, 00:18:44.583 "num_base_bdevs": 4, 00:18:44.583 "num_base_bdevs_discovered": 2, 00:18:44.583 "num_base_bdevs_operational": 4, 00:18:44.583 "base_bdevs_list": [ 00:18:44.583 { 00:18:44.583 "name": "BaseBdev1", 00:18:44.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.583 "is_configured": false, 00:18:44.583 "data_offset": 0, 00:18:44.583 "data_size": 0 00:18:44.583 }, 00:18:44.583 { 00:18:44.583 "name": null, 00:18:44.583 "uuid": "61c6f42f-6096-484a-875a-a72946105dd1", 00:18:44.583 "is_configured": false, 00:18:44.583 "data_offset": 0, 00:18:44.583 "data_size": 65536 00:18:44.583 }, 00:18:44.583 { 00:18:44.583 "name": "BaseBdev3", 00:18:44.583 "uuid": "e81192a3-9aab-464c-b946-cafd4f95dfd3", 00:18:44.583 "is_configured": true, 00:18:44.583 "data_offset": 0, 00:18:44.583 "data_size": 65536 00:18:44.583 }, 00:18:44.583 { 00:18:44.583 "name": "BaseBdev4", 00:18:44.583 "uuid": "dd29686d-9812-4aa5-9deb-96a3e104ac34", 00:18:44.583 "is_configured": true, 00:18:44.583 "data_offset": 0, 00:18:44.583 "data_size": 65536 00:18:44.583 } 00:18:44.583 ] 00:18:44.583 }' 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.583 22:25:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.148 22:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.148 22:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:45.406 22:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:45.406 22:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:45.664 [2024-07-12 22:25:55.900713] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:45.664 BaseBdev1 00:18:45.664 22:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:45.664 22:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:45.664 22:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:45.664 22:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:45.664 22:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:45.664 22:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:45.664 22:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:45.921 22:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:46.179 [ 00:18:46.179 { 00:18:46.179 "name": "BaseBdev1", 00:18:46.179 "aliases": [ 00:18:46.179 "f6395c75-c508-4077-aad9-553eaaaccd36" 00:18:46.179 ], 00:18:46.179 "product_name": "Malloc disk", 00:18:46.179 "block_size": 512, 00:18:46.179 "num_blocks": 65536, 00:18:46.179 "uuid": "f6395c75-c508-4077-aad9-553eaaaccd36", 00:18:46.179 "assigned_rate_limits": { 00:18:46.179 "rw_ios_per_sec": 0, 00:18:46.179 "rw_mbytes_per_sec": 0, 00:18:46.179 "r_mbytes_per_sec": 0, 00:18:46.179 "w_mbytes_per_sec": 0 00:18:46.179 }, 00:18:46.179 "claimed": true, 00:18:46.179 "claim_type": "exclusive_write", 00:18:46.179 "zoned": false, 00:18:46.179 "supported_io_types": { 00:18:46.179 "read": true, 00:18:46.179 "write": true, 00:18:46.179 "unmap": true, 00:18:46.179 "flush": true, 00:18:46.179 "reset": true, 00:18:46.179 "nvme_admin": false, 00:18:46.179 "nvme_io": false, 00:18:46.179 "nvme_io_md": false, 00:18:46.179 "write_zeroes": true, 00:18:46.179 "zcopy": true, 00:18:46.179 "get_zone_info": false, 00:18:46.179 "zone_management": false, 00:18:46.179 "zone_append": false, 00:18:46.179 "compare": false, 00:18:46.179 "compare_and_write": false, 00:18:46.179 "abort": true, 00:18:46.179 "seek_hole": false, 00:18:46.179 "seek_data": false, 00:18:46.179 "copy": true, 00:18:46.179 "nvme_iov_md": false 00:18:46.179 }, 00:18:46.179 "memory_domains": [ 00:18:46.179 { 00:18:46.179 "dma_device_id": "system", 00:18:46.179 "dma_device_type": 1 00:18:46.179 }, 00:18:46.179 { 00:18:46.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.179 "dma_device_type": 2 00:18:46.179 } 00:18:46.179 ], 00:18:46.179 "driver_specific": {} 00:18:46.179 } 00:18:46.179 ] 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.179 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:18:46.437 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:46.437 "name": "Existed_Raid", 00:18:46.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.437 "strip_size_kb": 64, 00:18:46.437 "state": "configuring", 00:18:46.437 "raid_level": "raid0", 00:18:46.437 "superblock": false, 00:18:46.437 "num_base_bdevs": 4, 00:18:46.437 "num_base_bdevs_discovered": 3, 00:18:46.437 "num_base_bdevs_operational": 4, 00:18:46.437 "base_bdevs_list": [ 00:18:46.437 { 00:18:46.437 "name": "BaseBdev1", 00:18:46.437 "uuid": "f6395c75-c508-4077-aad9-553eaaaccd36", 00:18:46.437 "is_configured": true, 00:18:46.437 "data_offset": 0, 00:18:46.437 "data_size": 65536 00:18:46.437 }, 00:18:46.437 { 00:18:46.437 "name": null, 00:18:46.437 "uuid": "61c6f42f-6096-484a-875a-a72946105dd1", 00:18:46.437 "is_configured": false, 00:18:46.437 "data_offset": 0, 00:18:46.437 "data_size": 65536 00:18:46.437 }, 00:18:46.437 { 00:18:46.437 "name": "BaseBdev3", 00:18:46.437 "uuid": "e81192a3-9aab-464c-b946-cafd4f95dfd3", 00:18:46.437 "is_configured": true, 00:18:46.437 "data_offset": 0, 00:18:46.437 "data_size": 65536 00:18:46.437 }, 00:18:46.437 { 00:18:46.437 "name": "BaseBdev4", 00:18:46.437 "uuid": "dd29686d-9812-4aa5-9deb-96a3e104ac34", 00:18:46.437 "is_configured": true, 00:18:46.437 "data_offset": 0, 00:18:46.437 "data_size": 65536 00:18:46.437 } 00:18:46.437 ] 00:18:46.437 }' 00:18:46.437 22:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:46.437 22:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:47.005 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.005 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:47.005 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:47.005 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:47.264 [2024-07-12 22:25:57.541111] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:47.264 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:47.264 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:47.264 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:47.264 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:47.264 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:47.264 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:47.264 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:47.264 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:47.264 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:47.264 22:25:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:18:47.264 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.264 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:47.524 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:47.524 "name": "Existed_Raid", 00:18:47.524 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:47.524 "strip_size_kb": 64, 00:18:47.524 "state": "configuring", 00:18:47.524 "raid_level": "raid0", 00:18:47.524 "superblock": false, 00:18:47.524 "num_base_bdevs": 4, 00:18:47.524 "num_base_bdevs_discovered": 2, 00:18:47.524 "num_base_bdevs_operational": 4, 00:18:47.524 "base_bdevs_list": [ 00:18:47.524 { 00:18:47.524 "name": "BaseBdev1", 00:18:47.524 "uuid": "f6395c75-c508-4077-aad9-553eaaaccd36", 00:18:47.524 "is_configured": true, 00:18:47.524 "data_offset": 0, 00:18:47.524 "data_size": 65536 00:18:47.524 }, 00:18:47.524 { 00:18:47.524 "name": null, 00:18:47.524 "uuid": "61c6f42f-6096-484a-875a-a72946105dd1", 00:18:47.524 "is_configured": false, 00:18:47.524 "data_offset": 0, 00:18:47.524 "data_size": 65536 00:18:47.524 }, 00:18:47.524 { 00:18:47.524 "name": null, 00:18:47.524 "uuid": "e81192a3-9aab-464c-b946-cafd4f95dfd3", 00:18:47.524 "is_configured": false, 00:18:47.524 "data_offset": 0, 00:18:47.524 "data_size": 65536 00:18:47.524 }, 00:18:47.524 { 00:18:47.524 "name": "BaseBdev4", 00:18:47.524 "uuid": "dd29686d-9812-4aa5-9deb-96a3e104ac34", 00:18:47.524 "is_configured": true, 00:18:47.524 "data_offset": 0, 00:18:47.524 "data_size": 65536 00:18:47.524 } 00:18:47.524 ] 00:18:47.524 }' 00:18:47.524 22:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:47.524 22:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:48.092 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.092 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:48.351 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:48.351 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:48.610 [2024-07-12 22:25:58.864642] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:48.610 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:48.610 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:48.610 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:48.610 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:48.610 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:48.610 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:48.610 22:25:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:48.610 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:48.610 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:48.610 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:48.610 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.610 22:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:48.869 22:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:48.869 "name": "Existed_Raid", 00:18:48.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.869 "strip_size_kb": 64, 00:18:48.869 "state": "configuring", 00:18:48.869 "raid_level": "raid0", 00:18:48.869 "superblock": false, 00:18:48.869 "num_base_bdevs": 4, 00:18:48.869 "num_base_bdevs_discovered": 3, 00:18:48.869 "num_base_bdevs_operational": 4, 00:18:48.869 "base_bdevs_list": [ 00:18:48.869 { 00:18:48.869 "name": "BaseBdev1", 00:18:48.869 "uuid": "f6395c75-c508-4077-aad9-553eaaaccd36", 00:18:48.869 "is_configured": true, 00:18:48.870 "data_offset": 0, 00:18:48.870 "data_size": 65536 00:18:48.870 }, 00:18:48.870 { 00:18:48.870 "name": null, 00:18:48.870 "uuid": "61c6f42f-6096-484a-875a-a72946105dd1", 00:18:48.870 "is_configured": false, 00:18:48.870 "data_offset": 0, 00:18:48.870 "data_size": 65536 00:18:48.870 }, 00:18:48.870 { 00:18:48.870 "name": "BaseBdev3", 00:18:48.870 "uuid": "e81192a3-9aab-464c-b946-cafd4f95dfd3", 00:18:48.870 "is_configured": true, 00:18:48.870 "data_offset": 0, 00:18:48.870 "data_size": 65536 00:18:48.870 }, 00:18:48.870 { 00:18:48.870 "name": "BaseBdev4", 00:18:48.870 "uuid": "dd29686d-9812-4aa5-9deb-96a3e104ac34", 00:18:48.870 "is_configured": true, 00:18:48.870 "data_offset": 0, 00:18:48.870 "data_size": 65536 00:18:48.870 } 00:18:48.870 ] 00:18:48.870 }' 00:18:48.870 22:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:48.870 22:25:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.438 22:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.438 22:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:49.696 22:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:49.697 22:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:49.955 [2024-07-12 22:26:00.192172] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:49.955 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:49.955 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:49.955 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:49.955 22:26:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:49.955 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:49.955 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:49.955 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:49.955 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:49.955 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:49.955 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:49.955 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.955 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:50.213 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:50.213 "name": "Existed_Raid", 00:18:50.213 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.213 "strip_size_kb": 64, 00:18:50.213 "state": "configuring", 00:18:50.213 "raid_level": "raid0", 00:18:50.214 "superblock": false, 00:18:50.214 "num_base_bdevs": 4, 00:18:50.214 "num_base_bdevs_discovered": 2, 00:18:50.214 "num_base_bdevs_operational": 4, 00:18:50.214 "base_bdevs_list": [ 00:18:50.214 { 00:18:50.214 "name": null, 00:18:50.214 "uuid": "f6395c75-c508-4077-aad9-553eaaaccd36", 00:18:50.214 "is_configured": false, 00:18:50.214 "data_offset": 0, 00:18:50.214 "data_size": 65536 00:18:50.214 }, 00:18:50.214 { 00:18:50.214 "name": null, 00:18:50.214 "uuid": "61c6f42f-6096-484a-875a-a72946105dd1", 00:18:50.214 "is_configured": false, 00:18:50.214 "data_offset": 0, 00:18:50.214 "data_size": 65536 00:18:50.214 }, 00:18:50.214 { 00:18:50.214 "name": "BaseBdev3", 00:18:50.214 "uuid": "e81192a3-9aab-464c-b946-cafd4f95dfd3", 00:18:50.214 "is_configured": true, 00:18:50.214 "data_offset": 0, 00:18:50.214 "data_size": 65536 00:18:50.214 }, 00:18:50.214 { 00:18:50.214 "name": "BaseBdev4", 00:18:50.214 "uuid": "dd29686d-9812-4aa5-9deb-96a3e104ac34", 00:18:50.214 "is_configured": true, 00:18:50.214 "data_offset": 0, 00:18:50.214 "data_size": 65536 00:18:50.214 } 00:18:50.214 ] 00:18:50.214 }' 00:18:50.214 22:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:50.214 22:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:50.779 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.779 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:51.037 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:51.037 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:51.295 [2024-07-12 22:26:01.544300] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:51.295 22:26:01 
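The steps around this point toggle individual slots with bdev_raid_remove_base_bdev and bdev_raid_add_base_bdev, rather than deleting the malloc bdev underneath, and read the per-slot is_configured flag back with an indexed jq path. A small sketch of one such round trip, assuming slot index 1 (BaseBdev2) as in the log:

# Sketch: detaching and re-attaching one member of a configuring raid bdev.
rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

rpc bdev_raid_remove_base_bdev BaseBdev2
rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'   # expected: false

rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev2
rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'   # expected: true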
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:51.295 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:51.295 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:51.295 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:51.295 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:51.295 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:51.295 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:51.295 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:51.295 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:51.295 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:51.295 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.295 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:51.552 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:51.552 "name": "Existed_Raid", 00:18:51.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:51.552 "strip_size_kb": 64, 00:18:51.552 "state": "configuring", 00:18:51.552 "raid_level": "raid0", 00:18:51.552 "superblock": false, 00:18:51.552 "num_base_bdevs": 4, 00:18:51.552 "num_base_bdevs_discovered": 3, 00:18:51.552 "num_base_bdevs_operational": 4, 00:18:51.552 "base_bdevs_list": [ 00:18:51.552 { 00:18:51.552 "name": null, 00:18:51.552 "uuid": "f6395c75-c508-4077-aad9-553eaaaccd36", 00:18:51.552 "is_configured": false, 00:18:51.552 "data_offset": 0, 00:18:51.552 "data_size": 65536 00:18:51.552 }, 00:18:51.552 { 00:18:51.552 "name": "BaseBdev2", 00:18:51.552 "uuid": "61c6f42f-6096-484a-875a-a72946105dd1", 00:18:51.552 "is_configured": true, 00:18:51.552 "data_offset": 0, 00:18:51.552 "data_size": 65536 00:18:51.553 }, 00:18:51.553 { 00:18:51.553 "name": "BaseBdev3", 00:18:51.553 "uuid": "e81192a3-9aab-464c-b946-cafd4f95dfd3", 00:18:51.553 "is_configured": true, 00:18:51.553 "data_offset": 0, 00:18:51.553 "data_size": 65536 00:18:51.553 }, 00:18:51.553 { 00:18:51.553 "name": "BaseBdev4", 00:18:51.553 "uuid": "dd29686d-9812-4aa5-9deb-96a3e104ac34", 00:18:51.553 "is_configured": true, 00:18:51.553 "data_offset": 0, 00:18:51.553 "data_size": 65536 00:18:51.553 } 00:18:51.553 ] 00:18:51.553 }' 00:18:51.553 22:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:51.553 22:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:52.117 22:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.117 22:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:52.376 22:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:52.376 
22:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.376 22:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:52.634 22:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f6395c75-c508-4077-aad9-553eaaaccd36 00:18:52.892 [2024-07-12 22:26:03.039649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:52.892 [2024-07-12 22:26:03.039689] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21b4040 00:18:52.892 [2024-07-12 22:26:03.039698] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:52.892 [2024-07-12 22:26:03.039899] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21afa70 00:18:52.892 [2024-07-12 22:26:03.040028] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21b4040 00:18:52.892 [2024-07-12 22:26:03.040039] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21b4040 00:18:52.892 [2024-07-12 22:26:03.040205] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:52.892 NewBaseBdev 00:18:52.892 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:52.892 22:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:52.892 22:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:52.892 22:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:52.892 22:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:52.892 22:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:52.892 22:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:53.149 22:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:53.408 [ 00:18:53.408 { 00:18:53.408 "name": "NewBaseBdev", 00:18:53.408 "aliases": [ 00:18:53.408 "f6395c75-c508-4077-aad9-553eaaaccd36" 00:18:53.408 ], 00:18:53.408 "product_name": "Malloc disk", 00:18:53.408 "block_size": 512, 00:18:53.408 "num_blocks": 65536, 00:18:53.408 "uuid": "f6395c75-c508-4077-aad9-553eaaaccd36", 00:18:53.408 "assigned_rate_limits": { 00:18:53.408 "rw_ios_per_sec": 0, 00:18:53.408 "rw_mbytes_per_sec": 0, 00:18:53.408 "r_mbytes_per_sec": 0, 00:18:53.408 "w_mbytes_per_sec": 0 00:18:53.408 }, 00:18:53.408 "claimed": true, 00:18:53.408 "claim_type": "exclusive_write", 00:18:53.408 "zoned": false, 00:18:53.408 "supported_io_types": { 00:18:53.408 "read": true, 00:18:53.408 "write": true, 00:18:53.408 "unmap": true, 00:18:53.408 "flush": true, 00:18:53.408 "reset": true, 00:18:53.408 "nvme_admin": false, 00:18:53.408 "nvme_io": false, 00:18:53.408 "nvme_io_md": false, 00:18:53.408 "write_zeroes": true, 00:18:53.408 "zcopy": true, 
00:18:53.408 "get_zone_info": false, 00:18:53.408 "zone_management": false, 00:18:53.408 "zone_append": false, 00:18:53.408 "compare": false, 00:18:53.408 "compare_and_write": false, 00:18:53.408 "abort": true, 00:18:53.408 "seek_hole": false, 00:18:53.408 "seek_data": false, 00:18:53.408 "copy": true, 00:18:53.408 "nvme_iov_md": false 00:18:53.408 }, 00:18:53.408 "memory_domains": [ 00:18:53.408 { 00:18:53.408 "dma_device_id": "system", 00:18:53.408 "dma_device_type": 1 00:18:53.408 }, 00:18:53.408 { 00:18:53.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.408 "dma_device_type": 2 00:18:53.408 } 00:18:53.408 ], 00:18:53.408 "driver_specific": {} 00:18:53.408 } 00:18:53.408 ] 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.408 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:53.667 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:53.667 "name": "Existed_Raid", 00:18:53.667 "uuid": "a179d3a1-6966-437e-a1c4-ad67042f189e", 00:18:53.667 "strip_size_kb": 64, 00:18:53.667 "state": "online", 00:18:53.667 "raid_level": "raid0", 00:18:53.667 "superblock": false, 00:18:53.667 "num_base_bdevs": 4, 00:18:53.667 "num_base_bdevs_discovered": 4, 00:18:53.667 "num_base_bdevs_operational": 4, 00:18:53.667 "base_bdevs_list": [ 00:18:53.667 { 00:18:53.667 "name": "NewBaseBdev", 00:18:53.667 "uuid": "f6395c75-c508-4077-aad9-553eaaaccd36", 00:18:53.667 "is_configured": true, 00:18:53.667 "data_offset": 0, 00:18:53.667 "data_size": 65536 00:18:53.667 }, 00:18:53.667 { 00:18:53.667 "name": "BaseBdev2", 00:18:53.667 "uuid": "61c6f42f-6096-484a-875a-a72946105dd1", 00:18:53.667 "is_configured": true, 00:18:53.667 "data_offset": 0, 00:18:53.667 "data_size": 65536 00:18:53.667 }, 00:18:53.667 { 00:18:53.667 "name": "BaseBdev3", 00:18:53.667 "uuid": "e81192a3-9aab-464c-b946-cafd4f95dfd3", 00:18:53.667 "is_configured": true, 00:18:53.667 "data_offset": 0, 00:18:53.667 "data_size": 65536 00:18:53.667 }, 00:18:53.667 { 00:18:53.667 "name": "BaseBdev4", 00:18:53.667 "uuid": 
"dd29686d-9812-4aa5-9deb-96a3e104ac34", 00:18:53.667 "is_configured": true, 00:18:53.667 "data_offset": 0, 00:18:53.667 "data_size": 65536 00:18:53.667 } 00:18:53.667 ] 00:18:53.667 }' 00:18:53.667 22:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:53.667 22:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:54.234 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:54.234 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:54.234 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:54.234 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:54.234 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:54.234 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:54.234 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:54.234 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:54.492 [2024-07-12 22:26:04.608167] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:54.492 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:54.492 "name": "Existed_Raid", 00:18:54.492 "aliases": [ 00:18:54.492 "a179d3a1-6966-437e-a1c4-ad67042f189e" 00:18:54.492 ], 00:18:54.492 "product_name": "Raid Volume", 00:18:54.492 "block_size": 512, 00:18:54.492 "num_blocks": 262144, 00:18:54.492 "uuid": "a179d3a1-6966-437e-a1c4-ad67042f189e", 00:18:54.492 "assigned_rate_limits": { 00:18:54.492 "rw_ios_per_sec": 0, 00:18:54.492 "rw_mbytes_per_sec": 0, 00:18:54.492 "r_mbytes_per_sec": 0, 00:18:54.492 "w_mbytes_per_sec": 0 00:18:54.492 }, 00:18:54.492 "claimed": false, 00:18:54.492 "zoned": false, 00:18:54.492 "supported_io_types": { 00:18:54.492 "read": true, 00:18:54.492 "write": true, 00:18:54.492 "unmap": true, 00:18:54.492 "flush": true, 00:18:54.492 "reset": true, 00:18:54.492 "nvme_admin": false, 00:18:54.492 "nvme_io": false, 00:18:54.492 "nvme_io_md": false, 00:18:54.492 "write_zeroes": true, 00:18:54.492 "zcopy": false, 00:18:54.492 "get_zone_info": false, 00:18:54.492 "zone_management": false, 00:18:54.492 "zone_append": false, 00:18:54.492 "compare": false, 00:18:54.492 "compare_and_write": false, 00:18:54.492 "abort": false, 00:18:54.492 "seek_hole": false, 00:18:54.492 "seek_data": false, 00:18:54.492 "copy": false, 00:18:54.492 "nvme_iov_md": false 00:18:54.492 }, 00:18:54.492 "memory_domains": [ 00:18:54.492 { 00:18:54.492 "dma_device_id": "system", 00:18:54.492 "dma_device_type": 1 00:18:54.492 }, 00:18:54.492 { 00:18:54.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.492 "dma_device_type": 2 00:18:54.492 }, 00:18:54.492 { 00:18:54.492 "dma_device_id": "system", 00:18:54.492 "dma_device_type": 1 00:18:54.492 }, 00:18:54.492 { 00:18:54.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.492 "dma_device_type": 2 00:18:54.492 }, 00:18:54.492 { 00:18:54.492 "dma_device_id": "system", 00:18:54.492 "dma_device_type": 1 00:18:54.492 }, 00:18:54.493 { 00:18:54.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.493 "dma_device_type": 2 00:18:54.493 }, 
00:18:54.493 { 00:18:54.493 "dma_device_id": "system", 00:18:54.493 "dma_device_type": 1 00:18:54.493 }, 00:18:54.493 { 00:18:54.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.493 "dma_device_type": 2 00:18:54.493 } 00:18:54.493 ], 00:18:54.493 "driver_specific": { 00:18:54.493 "raid": { 00:18:54.493 "uuid": "a179d3a1-6966-437e-a1c4-ad67042f189e", 00:18:54.493 "strip_size_kb": 64, 00:18:54.493 "state": "online", 00:18:54.493 "raid_level": "raid0", 00:18:54.493 "superblock": false, 00:18:54.493 "num_base_bdevs": 4, 00:18:54.493 "num_base_bdevs_discovered": 4, 00:18:54.493 "num_base_bdevs_operational": 4, 00:18:54.493 "base_bdevs_list": [ 00:18:54.493 { 00:18:54.493 "name": "NewBaseBdev", 00:18:54.493 "uuid": "f6395c75-c508-4077-aad9-553eaaaccd36", 00:18:54.493 "is_configured": true, 00:18:54.493 "data_offset": 0, 00:18:54.493 "data_size": 65536 00:18:54.493 }, 00:18:54.493 { 00:18:54.493 "name": "BaseBdev2", 00:18:54.493 "uuid": "61c6f42f-6096-484a-875a-a72946105dd1", 00:18:54.493 "is_configured": true, 00:18:54.493 "data_offset": 0, 00:18:54.493 "data_size": 65536 00:18:54.493 }, 00:18:54.493 { 00:18:54.493 "name": "BaseBdev3", 00:18:54.493 "uuid": "e81192a3-9aab-464c-b946-cafd4f95dfd3", 00:18:54.493 "is_configured": true, 00:18:54.493 "data_offset": 0, 00:18:54.493 "data_size": 65536 00:18:54.493 }, 00:18:54.493 { 00:18:54.493 "name": "BaseBdev4", 00:18:54.493 "uuid": "dd29686d-9812-4aa5-9deb-96a3e104ac34", 00:18:54.493 "is_configured": true, 00:18:54.493 "data_offset": 0, 00:18:54.493 "data_size": 65536 00:18:54.493 } 00:18:54.493 ] 00:18:54.493 } 00:18:54.493 } 00:18:54.493 }' 00:18:54.493 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:54.493 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:54.493 BaseBdev2 00:18:54.493 BaseBdev3 00:18:54.493 BaseBdev4' 00:18:54.493 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:54.493 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:54.493 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:54.751 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:54.751 "name": "NewBaseBdev", 00:18:54.751 "aliases": [ 00:18:54.751 "f6395c75-c508-4077-aad9-553eaaaccd36" 00:18:54.751 ], 00:18:54.751 "product_name": "Malloc disk", 00:18:54.751 "block_size": 512, 00:18:54.751 "num_blocks": 65536, 00:18:54.751 "uuid": "f6395c75-c508-4077-aad9-553eaaaccd36", 00:18:54.751 "assigned_rate_limits": { 00:18:54.751 "rw_ios_per_sec": 0, 00:18:54.751 "rw_mbytes_per_sec": 0, 00:18:54.751 "r_mbytes_per_sec": 0, 00:18:54.751 "w_mbytes_per_sec": 0 00:18:54.751 }, 00:18:54.751 "claimed": true, 00:18:54.751 "claim_type": "exclusive_write", 00:18:54.751 "zoned": false, 00:18:54.751 "supported_io_types": { 00:18:54.751 "read": true, 00:18:54.751 "write": true, 00:18:54.751 "unmap": true, 00:18:54.751 "flush": true, 00:18:54.751 "reset": true, 00:18:54.751 "nvme_admin": false, 00:18:54.751 "nvme_io": false, 00:18:54.751 "nvme_io_md": false, 00:18:54.751 "write_zeroes": true, 00:18:54.751 "zcopy": true, 00:18:54.751 "get_zone_info": false, 00:18:54.751 "zone_management": false, 00:18:54.751 "zone_append": false, 
00:18:54.751 "compare": false, 00:18:54.751 "compare_and_write": false, 00:18:54.751 "abort": true, 00:18:54.751 "seek_hole": false, 00:18:54.751 "seek_data": false, 00:18:54.751 "copy": true, 00:18:54.751 "nvme_iov_md": false 00:18:54.751 }, 00:18:54.751 "memory_domains": [ 00:18:54.751 { 00:18:54.751 "dma_device_id": "system", 00:18:54.751 "dma_device_type": 1 00:18:54.751 }, 00:18:54.751 { 00:18:54.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.751 "dma_device_type": 2 00:18:54.751 } 00:18:54.751 ], 00:18:54.751 "driver_specific": {} 00:18:54.751 }' 00:18:54.751 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.751 22:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.751 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:54.751 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.751 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.010 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:55.010 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.010 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.010 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:55.010 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.010 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.010 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:55.010 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:55.010 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:55.010 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:55.314 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:55.314 "name": "BaseBdev2", 00:18:55.314 "aliases": [ 00:18:55.314 "61c6f42f-6096-484a-875a-a72946105dd1" 00:18:55.314 ], 00:18:55.314 "product_name": "Malloc disk", 00:18:55.314 "block_size": 512, 00:18:55.314 "num_blocks": 65536, 00:18:55.314 "uuid": "61c6f42f-6096-484a-875a-a72946105dd1", 00:18:55.314 "assigned_rate_limits": { 00:18:55.314 "rw_ios_per_sec": 0, 00:18:55.314 "rw_mbytes_per_sec": 0, 00:18:55.314 "r_mbytes_per_sec": 0, 00:18:55.314 "w_mbytes_per_sec": 0 00:18:55.314 }, 00:18:55.314 "claimed": true, 00:18:55.314 "claim_type": "exclusive_write", 00:18:55.314 "zoned": false, 00:18:55.314 "supported_io_types": { 00:18:55.314 "read": true, 00:18:55.314 "write": true, 00:18:55.314 "unmap": true, 00:18:55.314 "flush": true, 00:18:55.314 "reset": true, 00:18:55.314 "nvme_admin": false, 00:18:55.314 "nvme_io": false, 00:18:55.314 "nvme_io_md": false, 00:18:55.314 "write_zeroes": true, 00:18:55.314 "zcopy": true, 00:18:55.314 "get_zone_info": false, 00:18:55.314 "zone_management": false, 00:18:55.314 "zone_append": false, 00:18:55.314 "compare": false, 00:18:55.314 "compare_and_write": false, 00:18:55.314 "abort": true, 00:18:55.314 "seek_hole": false, 00:18:55.314 "seek_data": false, 00:18:55.314 
"copy": true, 00:18:55.314 "nvme_iov_md": false 00:18:55.314 }, 00:18:55.314 "memory_domains": [ 00:18:55.314 { 00:18:55.314 "dma_device_id": "system", 00:18:55.314 "dma_device_type": 1 00:18:55.314 }, 00:18:55.314 { 00:18:55.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.314 "dma_device_type": 2 00:18:55.314 } 00:18:55.314 ], 00:18:55.314 "driver_specific": {} 00:18:55.314 }' 00:18:55.314 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.314 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.314 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:55.314 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.314 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.572 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:55.572 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.572 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.572 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:55.572 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.572 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.572 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:55.572 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:55.572 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:55.572 22:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:55.830 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:55.830 "name": "BaseBdev3", 00:18:55.830 "aliases": [ 00:18:55.830 "e81192a3-9aab-464c-b946-cafd4f95dfd3" 00:18:55.830 ], 00:18:55.830 "product_name": "Malloc disk", 00:18:55.830 "block_size": 512, 00:18:55.830 "num_blocks": 65536, 00:18:55.830 "uuid": "e81192a3-9aab-464c-b946-cafd4f95dfd3", 00:18:55.830 "assigned_rate_limits": { 00:18:55.830 "rw_ios_per_sec": 0, 00:18:55.830 "rw_mbytes_per_sec": 0, 00:18:55.830 "r_mbytes_per_sec": 0, 00:18:55.830 "w_mbytes_per_sec": 0 00:18:55.830 }, 00:18:55.830 "claimed": true, 00:18:55.830 "claim_type": "exclusive_write", 00:18:55.830 "zoned": false, 00:18:55.830 "supported_io_types": { 00:18:55.830 "read": true, 00:18:55.830 "write": true, 00:18:55.830 "unmap": true, 00:18:55.830 "flush": true, 00:18:55.830 "reset": true, 00:18:55.830 "nvme_admin": false, 00:18:55.830 "nvme_io": false, 00:18:55.830 "nvme_io_md": false, 00:18:55.830 "write_zeroes": true, 00:18:55.830 "zcopy": true, 00:18:55.830 "get_zone_info": false, 00:18:55.830 "zone_management": false, 00:18:55.830 "zone_append": false, 00:18:55.830 "compare": false, 00:18:55.830 "compare_and_write": false, 00:18:55.830 "abort": true, 00:18:55.830 "seek_hole": false, 00:18:55.830 "seek_data": false, 00:18:55.830 "copy": true, 00:18:55.830 "nvme_iov_md": false 00:18:55.830 }, 00:18:55.830 "memory_domains": [ 00:18:55.830 { 00:18:55.830 "dma_device_id": "system", 00:18:55.830 
"dma_device_type": 1 00:18:55.830 }, 00:18:55.830 { 00:18:55.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.830 "dma_device_type": 2 00:18:55.830 } 00:18:55.830 ], 00:18:55.830 "driver_specific": {} 00:18:55.830 }' 00:18:55.830 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.830 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:56.088 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:56.346 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:56.346 "name": "BaseBdev4", 00:18:56.346 "aliases": [ 00:18:56.346 "dd29686d-9812-4aa5-9deb-96a3e104ac34" 00:18:56.346 ], 00:18:56.346 "product_name": "Malloc disk", 00:18:56.346 "block_size": 512, 00:18:56.346 "num_blocks": 65536, 00:18:56.346 "uuid": "dd29686d-9812-4aa5-9deb-96a3e104ac34", 00:18:56.346 "assigned_rate_limits": { 00:18:56.346 "rw_ios_per_sec": 0, 00:18:56.346 "rw_mbytes_per_sec": 0, 00:18:56.346 "r_mbytes_per_sec": 0, 00:18:56.346 "w_mbytes_per_sec": 0 00:18:56.346 }, 00:18:56.346 "claimed": true, 00:18:56.346 "claim_type": "exclusive_write", 00:18:56.346 "zoned": false, 00:18:56.346 "supported_io_types": { 00:18:56.346 "read": true, 00:18:56.346 "write": true, 00:18:56.346 "unmap": true, 00:18:56.346 "flush": true, 00:18:56.346 "reset": true, 00:18:56.346 "nvme_admin": false, 00:18:56.346 "nvme_io": false, 00:18:56.346 "nvme_io_md": false, 00:18:56.346 "write_zeroes": true, 00:18:56.346 "zcopy": true, 00:18:56.346 "get_zone_info": false, 00:18:56.346 "zone_management": false, 00:18:56.346 "zone_append": false, 00:18:56.346 "compare": false, 00:18:56.346 "compare_and_write": false, 00:18:56.346 "abort": true, 00:18:56.346 "seek_hole": false, 00:18:56.346 "seek_data": false, 00:18:56.346 "copy": true, 00:18:56.346 "nvme_iov_md": false 00:18:56.346 }, 00:18:56.346 "memory_domains": [ 00:18:56.346 { 00:18:56.346 "dma_device_id": "system", 00:18:56.346 "dma_device_type": 1 00:18:56.346 }, 00:18:56.346 { 00:18:56.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.346 "dma_device_type": 2 00:18:56.346 } 00:18:56.346 ], 
00:18:56.346 "driver_specific": {} 00:18:56.346 }' 00:18:56.346 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:56.604 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:56.604 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:56.604 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:56.604 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:56.604 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:56.604 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:56.604 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:56.604 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:56.604 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:56.604 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:56.862 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:56.862 22:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:57.119 [2024-07-12 22:26:07.190704] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:57.119 [2024-07-12 22:26:07.190731] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:57.119 [2024-07-12 22:26:07.190790] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:57.119 [2024-07-12 22:26:07.190850] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:57.119 [2024-07-12 22:26:07.190862] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21b4040 name Existed_Raid, state offline 00:18:57.119 22:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 3481809 00:18:57.119 22:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3481809 ']' 00:18:57.119 22:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3481809 00:18:57.119 22:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:57.119 22:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:57.119 22:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3481809 00:18:57.120 22:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:57.120 22:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:57.120 22:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3481809' 00:18:57.120 killing process with pid 3481809 00:18:57.120 22:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3481809 00:18:57.120 [2024-07-12 22:26:07.260026] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:57.120 22:26:07 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@972 -- # wait 3481809 00:18:57.120 [2024-07-12 22:26:07.297454] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:57.377 22:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:57.377 00:18:57.377 real 0m31.366s 00:18:57.377 user 0m57.729s 00:18:57.377 sys 0m5.518s 00:18:57.377 22:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:57.377 22:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:57.377 ************************************ 00:18:57.377 END TEST raid_state_function_test 00:18:57.377 ************************************ 00:18:57.378 22:26:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:57.378 22:26:07 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:18:57.378 22:26:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:57.378 22:26:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:57.378 22:26:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:57.378 ************************************ 00:18:57.378 START TEST raid_state_function_test_sb 00:18:57.378 ************************************ 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 
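The verify_raid_bdev_state checks traced above boil down to querying the raid bdev over the test RPC socket and comparing fields of its JSON description. A minimal sketch of that pattern, assuming an SPDK app is already listening on /var/tmp/spdk-raid.sock and that a raid bdev named Existed_Raid exists (the variable names below are illustrative, not the test's own code):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
# Fetch the JSON description of one raid bdev and compare selected fields.
info=$($rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid")')
[[ $(echo "$info" | jq -r .state) == "configuring" ]]            # expected state
[[ $(echo "$info" | jq -r .raid_level) == "raid0" ]]             # expected level
[[ $(echo "$info" | jq -r .strip_size_kb) -eq 64 ]]
[[ $(echo "$info" | jq -r .num_base_bdevs_operational) -eq 4 ]]

The per-base-bdev property checks later in the trace (block_size, md_size, md_interleave, dif_type) follow the same shape, but read the full bdev description with bdev_get_bdevs -b <name> instead of bdev_raid_get_bdevs.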
00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3486555 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3486555' 00:18:57.378 Process raid pid: 3486555 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3486555 /var/tmp/spdk-raid.sock 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3486555 ']' 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:57.378 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:57.378 22:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:57.378 [2024-07-12 22:26:07.673374] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
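The _sb variant traced below drives a dedicated bdev_svc app over its own RPC socket and builds the raid0 volume with an on-disk superblock (-s). In the trace the raid is registered first and the base bdevs are added one by one, with the state moving from configuring to online; the sketch here collapses that into the simplest create-then-verify order, assuming the bdev_svc binary path shown in the trace and Malloc base bdevs (the polling loop is only an approximation of the test's waitforlisten helper):

app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

$app -r $sock -i 0 -L bdev_raid &            # start the target with raid debug logging
raid_pid=$!
until $rpc -s $sock rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done

# Four 32 MiB Malloc bdevs with 512-byte blocks, then a raid0 with superblock.
for b in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
    $rpc -s $sock bdev_malloc_create 32 512 -b $b
done
$rpc -s $sock bdev_raid_create -z 64 -s -r raid0 \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
$rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[0].state'   # expect "online"

# Teardown mirrors the end of the previous test: delete the raid, stop the app.
$rpc -s $sock bdev_raid_delete Existed_Raid
kill $raid_pid; wait $raid_pid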
00:18:57.378 [2024-07-12 22:26:07.673440] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:57.636 [2024-07-12 22:26:07.797915] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:57.636 [2024-07-12 22:26:07.896684] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:57.636 [2024-07-12 22:26:07.960757] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:57.636 [2024-07-12 22:26:07.960786] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:58.572 [2024-07-12 22:26:08.807637] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:58.572 [2024-07-12 22:26:08.807680] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:58.572 [2024-07-12 22:26:08.807691] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:58.572 [2024-07-12 22:26:08.807703] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:58.572 [2024-07-12 22:26:08.807712] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:58.572 [2024-07-12 22:26:08.807734] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:58.572 [2024-07-12 22:26:08.807743] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:58.572 [2024-07-12 22:26:08.807754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.572 22:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:58.830 22:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:58.830 "name": "Existed_Raid", 00:18:58.830 "uuid": "305942c6-02eb-4d4f-aedb-f0267916bba6", 00:18:58.830 "strip_size_kb": 64, 00:18:58.830 "state": "configuring", 00:18:58.830 "raid_level": "raid0", 00:18:58.830 "superblock": true, 00:18:58.830 "num_base_bdevs": 4, 00:18:58.830 "num_base_bdevs_discovered": 0, 00:18:58.830 "num_base_bdevs_operational": 4, 00:18:58.830 "base_bdevs_list": [ 00:18:58.830 { 00:18:58.830 "name": "BaseBdev1", 00:18:58.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:58.830 "is_configured": false, 00:18:58.830 "data_offset": 0, 00:18:58.830 "data_size": 0 00:18:58.830 }, 00:18:58.830 { 00:18:58.830 "name": "BaseBdev2", 00:18:58.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:58.830 "is_configured": false, 00:18:58.830 "data_offset": 0, 00:18:58.830 "data_size": 0 00:18:58.830 }, 00:18:58.830 { 00:18:58.830 "name": "BaseBdev3", 00:18:58.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:58.830 "is_configured": false, 00:18:58.830 "data_offset": 0, 00:18:58.830 "data_size": 0 00:18:58.830 }, 00:18:58.830 { 00:18:58.830 "name": "BaseBdev4", 00:18:58.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:58.830 "is_configured": false, 00:18:58.830 "data_offset": 0, 00:18:58.830 "data_size": 0 00:18:58.830 } 00:18:58.830 ] 00:18:58.830 }' 00:18:58.830 22:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:58.830 22:26:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:59.763 22:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:00.021 [2024-07-12 22:26:10.191129] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:00.021 [2024-07-12 22:26:10.191165] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cceaa0 name Existed_Raid, state configuring 00:19:00.021 22:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:00.278 [2024-07-12 22:26:10.379656] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:00.278 [2024-07-12 22:26:10.379685] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:00.278 [2024-07-12 22:26:10.379695] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:00.278 [2024-07-12 22:26:10.379707] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:00.278 [2024-07-12 22:26:10.379720] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:00.278 [2024-07-12 22:26:10.379732] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:00.278 [2024-07-12 22:26:10.379740] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 
00:19:00.278 [2024-07-12 22:26:10.379751] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:00.278 22:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:00.536 [2024-07-12 22:26:10.634119] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:00.536 BaseBdev1 00:19:00.536 22:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:00.536 22:26:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:00.536 22:26:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:00.536 22:26:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:00.536 22:26:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:00.536 22:26:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:00.536 22:26:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:00.536 22:26:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:00.793 [ 00:19:00.793 { 00:19:00.793 "name": "BaseBdev1", 00:19:00.793 "aliases": [ 00:19:00.793 "c8232c30-3151-44af-a5b8-8308b1b9790c" 00:19:00.793 ], 00:19:00.793 "product_name": "Malloc disk", 00:19:00.793 "block_size": 512, 00:19:00.793 "num_blocks": 65536, 00:19:00.793 "uuid": "c8232c30-3151-44af-a5b8-8308b1b9790c", 00:19:00.793 "assigned_rate_limits": { 00:19:00.793 "rw_ios_per_sec": 0, 00:19:00.793 "rw_mbytes_per_sec": 0, 00:19:00.793 "r_mbytes_per_sec": 0, 00:19:00.793 "w_mbytes_per_sec": 0 00:19:00.793 }, 00:19:00.793 "claimed": true, 00:19:00.793 "claim_type": "exclusive_write", 00:19:00.793 "zoned": false, 00:19:00.793 "supported_io_types": { 00:19:00.793 "read": true, 00:19:00.793 "write": true, 00:19:00.793 "unmap": true, 00:19:00.793 "flush": true, 00:19:00.793 "reset": true, 00:19:00.793 "nvme_admin": false, 00:19:00.793 "nvme_io": false, 00:19:00.793 "nvme_io_md": false, 00:19:00.793 "write_zeroes": true, 00:19:00.793 "zcopy": true, 00:19:00.793 "get_zone_info": false, 00:19:00.793 "zone_management": false, 00:19:00.793 "zone_append": false, 00:19:00.793 "compare": false, 00:19:00.793 "compare_and_write": false, 00:19:00.793 "abort": true, 00:19:00.793 "seek_hole": false, 00:19:00.793 "seek_data": false, 00:19:00.793 "copy": true, 00:19:00.793 "nvme_iov_md": false 00:19:00.793 }, 00:19:00.793 "memory_domains": [ 00:19:00.793 { 00:19:00.793 "dma_device_id": "system", 00:19:00.793 "dma_device_type": 1 00:19:00.793 }, 00:19:00.793 { 00:19:00.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:00.793 "dma_device_type": 2 00:19:00.793 } 00:19:00.793 ], 00:19:00.793 "driver_specific": {} 00:19:00.793 } 00:19:00.793 ] 00:19:00.793 22:26:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:00.793 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:00.793 
22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:00.793 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:00.793 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:00.793 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:00.793 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:00.793 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:00.793 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:00.793 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:00.793 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:00.793 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.793 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:01.051 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.051 "name": "Existed_Raid", 00:19:01.051 "uuid": "a6cae444-a92e-4964-a818-e00cbbc54e33", 00:19:01.051 "strip_size_kb": 64, 00:19:01.051 "state": "configuring", 00:19:01.051 "raid_level": "raid0", 00:19:01.051 "superblock": true, 00:19:01.051 "num_base_bdevs": 4, 00:19:01.051 "num_base_bdevs_discovered": 1, 00:19:01.051 "num_base_bdevs_operational": 4, 00:19:01.051 "base_bdevs_list": [ 00:19:01.051 { 00:19:01.051 "name": "BaseBdev1", 00:19:01.051 "uuid": "c8232c30-3151-44af-a5b8-8308b1b9790c", 00:19:01.051 "is_configured": true, 00:19:01.051 "data_offset": 2048, 00:19:01.051 "data_size": 63488 00:19:01.051 }, 00:19:01.051 { 00:19:01.051 "name": "BaseBdev2", 00:19:01.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.051 "is_configured": false, 00:19:01.051 "data_offset": 0, 00:19:01.051 "data_size": 0 00:19:01.051 }, 00:19:01.051 { 00:19:01.051 "name": "BaseBdev3", 00:19:01.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.051 "is_configured": false, 00:19:01.051 "data_offset": 0, 00:19:01.051 "data_size": 0 00:19:01.051 }, 00:19:01.051 { 00:19:01.051 "name": "BaseBdev4", 00:19:01.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.051 "is_configured": false, 00:19:01.051 "data_offset": 0, 00:19:01.051 "data_size": 0 00:19:01.051 } 00:19:01.051 ] 00:19:01.051 }' 00:19:01.051 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.051 22:26:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:01.616 22:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:01.616 [2024-07-12 22:26:11.917543] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:01.616 [2024-07-12 22:26:11.917585] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cce310 name Existed_Raid, state configuring 00:19:01.616 22:26:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:01.874 [2024-07-12 22:26:12.162234] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:01.874 [2024-07-12 22:26:12.163683] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:01.874 [2024-07-12 22:26:12.163715] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:01.874 [2024-07-12 22:26:12.163725] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:01.874 [2024-07-12 22:26:12.163737] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:01.874 [2024-07-12 22:26:12.163746] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:01.874 [2024-07-12 22:26:12.163758] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.874 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:02.131 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:02.131 "name": "Existed_Raid", 00:19:02.131 "uuid": "b51a3720-d8a2-4c31-b6bd-462b69e7266e", 00:19:02.131 "strip_size_kb": 64, 00:19:02.131 "state": "configuring", 00:19:02.131 "raid_level": "raid0", 00:19:02.131 "superblock": true, 00:19:02.131 "num_base_bdevs": 4, 00:19:02.131 "num_base_bdevs_discovered": 1, 00:19:02.131 "num_base_bdevs_operational": 4, 00:19:02.131 "base_bdevs_list": [ 00:19:02.131 { 00:19:02.131 "name": "BaseBdev1", 00:19:02.131 "uuid": "c8232c30-3151-44af-a5b8-8308b1b9790c", 00:19:02.131 "is_configured": true, 00:19:02.131 "data_offset": 2048, 
00:19:02.131 "data_size": 63488 00:19:02.131 }, 00:19:02.131 { 00:19:02.131 "name": "BaseBdev2", 00:19:02.131 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.131 "is_configured": false, 00:19:02.131 "data_offset": 0, 00:19:02.131 "data_size": 0 00:19:02.131 }, 00:19:02.131 { 00:19:02.131 "name": "BaseBdev3", 00:19:02.131 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.131 "is_configured": false, 00:19:02.131 "data_offset": 0, 00:19:02.131 "data_size": 0 00:19:02.131 }, 00:19:02.131 { 00:19:02.131 "name": "BaseBdev4", 00:19:02.131 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.131 "is_configured": false, 00:19:02.131 "data_offset": 0, 00:19:02.131 "data_size": 0 00:19:02.131 } 00:19:02.131 ] 00:19:02.131 }' 00:19:02.131 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:02.131 22:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:02.709 22:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:02.966 [2024-07-12 22:26:13.184427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:02.966 BaseBdev2 00:19:02.966 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:02.966 22:26:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:02.966 22:26:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:02.966 22:26:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:02.966 22:26:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:02.966 22:26:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:02.966 22:26:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:03.223 22:26:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:03.481 [ 00:19:03.481 { 00:19:03.481 "name": "BaseBdev2", 00:19:03.481 "aliases": [ 00:19:03.481 "b6247073-35ab-4985-b858-9bf79609be4f" 00:19:03.481 ], 00:19:03.481 "product_name": "Malloc disk", 00:19:03.481 "block_size": 512, 00:19:03.481 "num_blocks": 65536, 00:19:03.481 "uuid": "b6247073-35ab-4985-b858-9bf79609be4f", 00:19:03.481 "assigned_rate_limits": { 00:19:03.481 "rw_ios_per_sec": 0, 00:19:03.481 "rw_mbytes_per_sec": 0, 00:19:03.481 "r_mbytes_per_sec": 0, 00:19:03.481 "w_mbytes_per_sec": 0 00:19:03.481 }, 00:19:03.481 "claimed": true, 00:19:03.481 "claim_type": "exclusive_write", 00:19:03.481 "zoned": false, 00:19:03.481 "supported_io_types": { 00:19:03.481 "read": true, 00:19:03.481 "write": true, 00:19:03.481 "unmap": true, 00:19:03.481 "flush": true, 00:19:03.481 "reset": true, 00:19:03.481 "nvme_admin": false, 00:19:03.481 "nvme_io": false, 00:19:03.481 "nvme_io_md": false, 00:19:03.481 "write_zeroes": true, 00:19:03.481 "zcopy": true, 00:19:03.481 "get_zone_info": false, 00:19:03.481 "zone_management": false, 00:19:03.481 "zone_append": false, 00:19:03.481 "compare": false, 
00:19:03.481 "compare_and_write": false, 00:19:03.481 "abort": true, 00:19:03.481 "seek_hole": false, 00:19:03.481 "seek_data": false, 00:19:03.481 "copy": true, 00:19:03.481 "nvme_iov_md": false 00:19:03.481 }, 00:19:03.481 "memory_domains": [ 00:19:03.481 { 00:19:03.481 "dma_device_id": "system", 00:19:03.481 "dma_device_type": 1 00:19:03.481 }, 00:19:03.481 { 00:19:03.481 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:03.481 "dma_device_type": 2 00:19:03.481 } 00:19:03.481 ], 00:19:03.481 "driver_specific": {} 00:19:03.481 } 00:19:03.481 ] 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.481 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:03.738 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:03.738 "name": "Existed_Raid", 00:19:03.738 "uuid": "b51a3720-d8a2-4c31-b6bd-462b69e7266e", 00:19:03.738 "strip_size_kb": 64, 00:19:03.738 "state": "configuring", 00:19:03.738 "raid_level": "raid0", 00:19:03.738 "superblock": true, 00:19:03.738 "num_base_bdevs": 4, 00:19:03.738 "num_base_bdevs_discovered": 2, 00:19:03.738 "num_base_bdevs_operational": 4, 00:19:03.738 "base_bdevs_list": [ 00:19:03.738 { 00:19:03.738 "name": "BaseBdev1", 00:19:03.738 "uuid": "c8232c30-3151-44af-a5b8-8308b1b9790c", 00:19:03.738 "is_configured": true, 00:19:03.738 "data_offset": 2048, 00:19:03.738 "data_size": 63488 00:19:03.738 }, 00:19:03.738 { 00:19:03.738 "name": "BaseBdev2", 00:19:03.738 "uuid": "b6247073-35ab-4985-b858-9bf79609be4f", 00:19:03.738 "is_configured": true, 00:19:03.738 "data_offset": 2048, 00:19:03.738 "data_size": 63488 00:19:03.738 }, 00:19:03.738 { 00:19:03.738 "name": "BaseBdev3", 00:19:03.738 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:03.738 "is_configured": false, 00:19:03.738 "data_offset": 0, 00:19:03.738 
"data_size": 0 00:19:03.738 }, 00:19:03.738 { 00:19:03.738 "name": "BaseBdev4", 00:19:03.738 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:03.738 "is_configured": false, 00:19:03.738 "data_offset": 0, 00:19:03.738 "data_size": 0 00:19:03.738 } 00:19:03.738 ] 00:19:03.738 }' 00:19:03.738 22:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:03.738 22:26:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:04.304 22:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:04.562 [2024-07-12 22:26:14.635715] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:04.562 BaseBdev3 00:19:04.562 22:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:04.562 22:26:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:04.562 22:26:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:04.562 22:26:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:04.562 22:26:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:04.562 22:26:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:04.562 22:26:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:04.819 22:26:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:04.819 [ 00:19:04.819 { 00:19:04.819 "name": "BaseBdev3", 00:19:04.819 "aliases": [ 00:19:04.819 "3c44a257-eb77-4db3-b7c7-046663d130e4" 00:19:04.819 ], 00:19:04.819 "product_name": "Malloc disk", 00:19:04.819 "block_size": 512, 00:19:04.819 "num_blocks": 65536, 00:19:04.819 "uuid": "3c44a257-eb77-4db3-b7c7-046663d130e4", 00:19:04.819 "assigned_rate_limits": { 00:19:04.819 "rw_ios_per_sec": 0, 00:19:04.819 "rw_mbytes_per_sec": 0, 00:19:04.819 "r_mbytes_per_sec": 0, 00:19:04.819 "w_mbytes_per_sec": 0 00:19:04.819 }, 00:19:04.819 "claimed": true, 00:19:04.819 "claim_type": "exclusive_write", 00:19:04.819 "zoned": false, 00:19:04.819 "supported_io_types": { 00:19:04.819 "read": true, 00:19:04.819 "write": true, 00:19:04.819 "unmap": true, 00:19:04.819 "flush": true, 00:19:04.819 "reset": true, 00:19:04.819 "nvme_admin": false, 00:19:04.819 "nvme_io": false, 00:19:04.819 "nvme_io_md": false, 00:19:04.819 "write_zeroes": true, 00:19:04.819 "zcopy": true, 00:19:04.819 "get_zone_info": false, 00:19:04.819 "zone_management": false, 00:19:04.819 "zone_append": false, 00:19:04.819 "compare": false, 00:19:04.819 "compare_and_write": false, 00:19:04.819 "abort": true, 00:19:04.819 "seek_hole": false, 00:19:04.819 "seek_data": false, 00:19:04.819 "copy": true, 00:19:04.819 "nvme_iov_md": false 00:19:04.819 }, 00:19:04.819 "memory_domains": [ 00:19:04.819 { 00:19:04.819 "dma_device_id": "system", 00:19:04.819 "dma_device_type": 1 00:19:04.819 }, 00:19:04.819 { 00:19:04.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:04.819 "dma_device_type": 2 
00:19:04.819 } 00:19:04.819 ], 00:19:04.819 "driver_specific": {} 00:19:04.819 } 00:19:04.819 ] 00:19:04.819 22:26:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:04.819 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:04.819 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:04.819 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:04.819 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:04.819 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:04.820 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:04.820 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:04.820 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:04.820 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.820 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.820 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.820 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:04.820 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:04.820 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.077 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.077 "name": "Existed_Raid", 00:19:05.077 "uuid": "b51a3720-d8a2-4c31-b6bd-462b69e7266e", 00:19:05.077 "strip_size_kb": 64, 00:19:05.077 "state": "configuring", 00:19:05.077 "raid_level": "raid0", 00:19:05.077 "superblock": true, 00:19:05.077 "num_base_bdevs": 4, 00:19:05.077 "num_base_bdevs_discovered": 3, 00:19:05.077 "num_base_bdevs_operational": 4, 00:19:05.077 "base_bdevs_list": [ 00:19:05.077 { 00:19:05.077 "name": "BaseBdev1", 00:19:05.077 "uuid": "c8232c30-3151-44af-a5b8-8308b1b9790c", 00:19:05.077 "is_configured": true, 00:19:05.077 "data_offset": 2048, 00:19:05.077 "data_size": 63488 00:19:05.077 }, 00:19:05.077 { 00:19:05.077 "name": "BaseBdev2", 00:19:05.077 "uuid": "b6247073-35ab-4985-b858-9bf79609be4f", 00:19:05.077 "is_configured": true, 00:19:05.077 "data_offset": 2048, 00:19:05.077 "data_size": 63488 00:19:05.077 }, 00:19:05.077 { 00:19:05.077 "name": "BaseBdev3", 00:19:05.077 "uuid": "3c44a257-eb77-4db3-b7c7-046663d130e4", 00:19:05.077 "is_configured": true, 00:19:05.077 "data_offset": 2048, 00:19:05.077 "data_size": 63488 00:19:05.077 }, 00:19:05.077 { 00:19:05.077 "name": "BaseBdev4", 00:19:05.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:05.077 "is_configured": false, 00:19:05.077 "data_offset": 0, 00:19:05.077 "data_size": 0 00:19:05.077 } 00:19:05.077 ] 00:19:05.077 }' 00:19:05.077 22:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.077 22:26:15 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:19:06.011 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:06.011 [2024-07-12 22:26:16.331646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:06.011 [2024-07-12 22:26:16.331814] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ccf350 00:19:06.011 [2024-07-12 22:26:16.331828] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:06.011 [2024-07-12 22:26:16.332015] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ccf020 00:19:06.011 [2024-07-12 22:26:16.332135] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ccf350 00:19:06.011 [2024-07-12 22:26:16.332145] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ccf350 00:19:06.011 [2024-07-12 22:26:16.332235] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:06.011 BaseBdev4 00:19:06.268 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:06.268 22:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:06.268 22:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:06.268 22:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:06.268 22:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:06.268 22:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:06.268 22:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:06.525 22:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:06.526 [ 00:19:06.526 { 00:19:06.526 "name": "BaseBdev4", 00:19:06.526 "aliases": [ 00:19:06.526 "06bfaeca-85f5-4b21-b56d-7783ebf7b245" 00:19:06.526 ], 00:19:06.526 "product_name": "Malloc disk", 00:19:06.526 "block_size": 512, 00:19:06.526 "num_blocks": 65536, 00:19:06.526 "uuid": "06bfaeca-85f5-4b21-b56d-7783ebf7b245", 00:19:06.526 "assigned_rate_limits": { 00:19:06.526 "rw_ios_per_sec": 0, 00:19:06.526 "rw_mbytes_per_sec": 0, 00:19:06.526 "r_mbytes_per_sec": 0, 00:19:06.526 "w_mbytes_per_sec": 0 00:19:06.526 }, 00:19:06.526 "claimed": true, 00:19:06.526 "claim_type": "exclusive_write", 00:19:06.526 "zoned": false, 00:19:06.526 "supported_io_types": { 00:19:06.526 "read": true, 00:19:06.526 "write": true, 00:19:06.526 "unmap": true, 00:19:06.526 "flush": true, 00:19:06.526 "reset": true, 00:19:06.526 "nvme_admin": false, 00:19:06.526 "nvme_io": false, 00:19:06.526 "nvme_io_md": false, 00:19:06.526 "write_zeroes": true, 00:19:06.526 "zcopy": true, 00:19:06.526 "get_zone_info": false, 00:19:06.526 "zone_management": false, 00:19:06.526 "zone_append": false, 00:19:06.526 "compare": false, 00:19:06.526 "compare_and_write": false, 00:19:06.526 "abort": true, 00:19:06.526 "seek_hole": false, 00:19:06.526 "seek_data": false, 00:19:06.526 "copy": 
true, 00:19:06.526 "nvme_iov_md": false 00:19:06.526 }, 00:19:06.526 "memory_domains": [ 00:19:06.526 { 00:19:06.526 "dma_device_id": "system", 00:19:06.526 "dma_device_type": 1 00:19:06.526 }, 00:19:06.526 { 00:19:06.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.526 "dma_device_type": 2 00:19:06.526 } 00:19:06.526 ], 00:19:06.526 "driver_specific": {} 00:19:06.526 } 00:19:06.526 ] 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:06.526 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:06.784 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.784 22:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:06.784 22:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.784 "name": "Existed_Raid", 00:19:06.784 "uuid": "b51a3720-d8a2-4c31-b6bd-462b69e7266e", 00:19:06.784 "strip_size_kb": 64, 00:19:06.784 "state": "online", 00:19:06.784 "raid_level": "raid0", 00:19:06.784 "superblock": true, 00:19:06.784 "num_base_bdevs": 4, 00:19:06.784 "num_base_bdevs_discovered": 4, 00:19:06.784 "num_base_bdevs_operational": 4, 00:19:06.784 "base_bdevs_list": [ 00:19:06.784 { 00:19:06.784 "name": "BaseBdev1", 00:19:06.784 "uuid": "c8232c30-3151-44af-a5b8-8308b1b9790c", 00:19:06.784 "is_configured": true, 00:19:06.784 "data_offset": 2048, 00:19:06.784 "data_size": 63488 00:19:06.784 }, 00:19:06.784 { 00:19:06.784 "name": "BaseBdev2", 00:19:06.784 "uuid": "b6247073-35ab-4985-b858-9bf79609be4f", 00:19:06.784 "is_configured": true, 00:19:06.784 "data_offset": 2048, 00:19:06.784 "data_size": 63488 00:19:06.784 }, 00:19:06.784 { 00:19:06.784 "name": "BaseBdev3", 00:19:06.784 "uuid": "3c44a257-eb77-4db3-b7c7-046663d130e4", 00:19:06.784 "is_configured": true, 00:19:06.784 "data_offset": 2048, 00:19:06.784 "data_size": 63488 00:19:06.784 }, 00:19:06.784 { 00:19:06.784 "name": "BaseBdev4", 00:19:06.784 "uuid": "06bfaeca-85f5-4b21-b56d-7783ebf7b245", 00:19:06.784 
"is_configured": true, 00:19:06.784 "data_offset": 2048, 00:19:06.784 "data_size": 63488 00:19:06.784 } 00:19:06.784 ] 00:19:06.784 }' 00:19:06.784 22:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.784 22:26:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:07.717 22:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:07.717 22:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:07.717 22:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:07.717 22:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:07.717 22:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:07.717 22:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:07.717 22:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:07.717 22:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:07.717 [2024-07-12 22:26:17.944321] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:07.717 22:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:07.717 "name": "Existed_Raid", 00:19:07.717 "aliases": [ 00:19:07.717 "b51a3720-d8a2-4c31-b6bd-462b69e7266e" 00:19:07.717 ], 00:19:07.717 "product_name": "Raid Volume", 00:19:07.717 "block_size": 512, 00:19:07.717 "num_blocks": 253952, 00:19:07.717 "uuid": "b51a3720-d8a2-4c31-b6bd-462b69e7266e", 00:19:07.717 "assigned_rate_limits": { 00:19:07.717 "rw_ios_per_sec": 0, 00:19:07.717 "rw_mbytes_per_sec": 0, 00:19:07.717 "r_mbytes_per_sec": 0, 00:19:07.717 "w_mbytes_per_sec": 0 00:19:07.717 }, 00:19:07.717 "claimed": false, 00:19:07.717 "zoned": false, 00:19:07.717 "supported_io_types": { 00:19:07.717 "read": true, 00:19:07.717 "write": true, 00:19:07.717 "unmap": true, 00:19:07.717 "flush": true, 00:19:07.717 "reset": true, 00:19:07.717 "nvme_admin": false, 00:19:07.717 "nvme_io": false, 00:19:07.717 "nvme_io_md": false, 00:19:07.717 "write_zeroes": true, 00:19:07.717 "zcopy": false, 00:19:07.717 "get_zone_info": false, 00:19:07.717 "zone_management": false, 00:19:07.717 "zone_append": false, 00:19:07.717 "compare": false, 00:19:07.717 "compare_and_write": false, 00:19:07.717 "abort": false, 00:19:07.717 "seek_hole": false, 00:19:07.717 "seek_data": false, 00:19:07.717 "copy": false, 00:19:07.717 "nvme_iov_md": false 00:19:07.717 }, 00:19:07.717 "memory_domains": [ 00:19:07.717 { 00:19:07.717 "dma_device_id": "system", 00:19:07.717 "dma_device_type": 1 00:19:07.717 }, 00:19:07.717 { 00:19:07.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.717 "dma_device_type": 2 00:19:07.717 }, 00:19:07.717 { 00:19:07.717 "dma_device_id": "system", 00:19:07.717 "dma_device_type": 1 00:19:07.717 }, 00:19:07.717 { 00:19:07.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.717 "dma_device_type": 2 00:19:07.717 }, 00:19:07.717 { 00:19:07.717 "dma_device_id": "system", 00:19:07.717 "dma_device_type": 1 00:19:07.717 }, 00:19:07.717 { 00:19:07.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.717 "dma_device_type": 2 00:19:07.717 }, 00:19:07.717 { 
00:19:07.718 "dma_device_id": "system", 00:19:07.718 "dma_device_type": 1 00:19:07.718 }, 00:19:07.718 { 00:19:07.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.718 "dma_device_type": 2 00:19:07.718 } 00:19:07.718 ], 00:19:07.718 "driver_specific": { 00:19:07.718 "raid": { 00:19:07.718 "uuid": "b51a3720-d8a2-4c31-b6bd-462b69e7266e", 00:19:07.718 "strip_size_kb": 64, 00:19:07.718 "state": "online", 00:19:07.718 "raid_level": "raid0", 00:19:07.718 "superblock": true, 00:19:07.718 "num_base_bdevs": 4, 00:19:07.718 "num_base_bdevs_discovered": 4, 00:19:07.718 "num_base_bdevs_operational": 4, 00:19:07.718 "base_bdevs_list": [ 00:19:07.718 { 00:19:07.718 "name": "BaseBdev1", 00:19:07.718 "uuid": "c8232c30-3151-44af-a5b8-8308b1b9790c", 00:19:07.718 "is_configured": true, 00:19:07.718 "data_offset": 2048, 00:19:07.718 "data_size": 63488 00:19:07.718 }, 00:19:07.718 { 00:19:07.718 "name": "BaseBdev2", 00:19:07.718 "uuid": "b6247073-35ab-4985-b858-9bf79609be4f", 00:19:07.718 "is_configured": true, 00:19:07.718 "data_offset": 2048, 00:19:07.718 "data_size": 63488 00:19:07.718 }, 00:19:07.718 { 00:19:07.718 "name": "BaseBdev3", 00:19:07.718 "uuid": "3c44a257-eb77-4db3-b7c7-046663d130e4", 00:19:07.718 "is_configured": true, 00:19:07.718 "data_offset": 2048, 00:19:07.718 "data_size": 63488 00:19:07.718 }, 00:19:07.718 { 00:19:07.718 "name": "BaseBdev4", 00:19:07.718 "uuid": "06bfaeca-85f5-4b21-b56d-7783ebf7b245", 00:19:07.718 "is_configured": true, 00:19:07.718 "data_offset": 2048, 00:19:07.718 "data_size": 63488 00:19:07.718 } 00:19:07.718 ] 00:19:07.718 } 00:19:07.718 } 00:19:07.718 }' 00:19:07.718 22:26:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:07.718 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:07.718 BaseBdev2 00:19:07.718 BaseBdev3 00:19:07.718 BaseBdev4' 00:19:07.718 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:07.718 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:07.718 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:07.975 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:07.975 "name": "BaseBdev1", 00:19:07.975 "aliases": [ 00:19:07.975 "c8232c30-3151-44af-a5b8-8308b1b9790c" 00:19:07.975 ], 00:19:07.975 "product_name": "Malloc disk", 00:19:07.975 "block_size": 512, 00:19:07.975 "num_blocks": 65536, 00:19:07.975 "uuid": "c8232c30-3151-44af-a5b8-8308b1b9790c", 00:19:07.975 "assigned_rate_limits": { 00:19:07.975 "rw_ios_per_sec": 0, 00:19:07.975 "rw_mbytes_per_sec": 0, 00:19:07.975 "r_mbytes_per_sec": 0, 00:19:07.975 "w_mbytes_per_sec": 0 00:19:07.975 }, 00:19:07.975 "claimed": true, 00:19:07.975 "claim_type": "exclusive_write", 00:19:07.975 "zoned": false, 00:19:07.975 "supported_io_types": { 00:19:07.975 "read": true, 00:19:07.975 "write": true, 00:19:07.975 "unmap": true, 00:19:07.975 "flush": true, 00:19:07.975 "reset": true, 00:19:07.975 "nvme_admin": false, 00:19:07.975 "nvme_io": false, 00:19:07.975 "nvme_io_md": false, 00:19:07.975 "write_zeroes": true, 00:19:07.975 "zcopy": true, 00:19:07.975 "get_zone_info": false, 00:19:07.975 "zone_management": false, 00:19:07.975 "zone_append": 
false, 00:19:07.975 "compare": false, 00:19:07.975 "compare_and_write": false, 00:19:07.975 "abort": true, 00:19:07.975 "seek_hole": false, 00:19:07.975 "seek_data": false, 00:19:07.975 "copy": true, 00:19:07.975 "nvme_iov_md": false 00:19:07.975 }, 00:19:07.975 "memory_domains": [ 00:19:07.975 { 00:19:07.975 "dma_device_id": "system", 00:19:07.975 "dma_device_type": 1 00:19:07.975 }, 00:19:07.975 { 00:19:07.975 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.975 "dma_device_type": 2 00:19:07.975 } 00:19:07.975 ], 00:19:07.975 "driver_specific": {} 00:19:07.975 }' 00:19:07.975 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:08.233 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:08.233 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:08.233 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.233 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.233 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:08.233 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.233 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.233 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:08.233 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:08.233 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:08.490 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:08.490 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:08.490 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:08.490 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:08.748 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:08.748 "name": "BaseBdev2", 00:19:08.748 "aliases": [ 00:19:08.748 "b6247073-35ab-4985-b858-9bf79609be4f" 00:19:08.748 ], 00:19:08.748 "product_name": "Malloc disk", 00:19:08.748 "block_size": 512, 00:19:08.748 "num_blocks": 65536, 00:19:08.748 "uuid": "b6247073-35ab-4985-b858-9bf79609be4f", 00:19:08.748 "assigned_rate_limits": { 00:19:08.748 "rw_ios_per_sec": 0, 00:19:08.748 "rw_mbytes_per_sec": 0, 00:19:08.748 "r_mbytes_per_sec": 0, 00:19:08.748 "w_mbytes_per_sec": 0 00:19:08.748 }, 00:19:08.748 "claimed": true, 00:19:08.748 "claim_type": "exclusive_write", 00:19:08.748 "zoned": false, 00:19:08.748 "supported_io_types": { 00:19:08.748 "read": true, 00:19:08.748 "write": true, 00:19:08.748 "unmap": true, 00:19:08.748 "flush": true, 00:19:08.748 "reset": true, 00:19:08.748 "nvme_admin": false, 00:19:08.748 "nvme_io": false, 00:19:08.748 "nvme_io_md": false, 00:19:08.748 "write_zeroes": true, 00:19:08.748 "zcopy": true, 00:19:08.748 "get_zone_info": false, 00:19:08.748 "zone_management": false, 00:19:08.748 "zone_append": false, 00:19:08.748 "compare": false, 00:19:08.748 "compare_and_write": false, 00:19:08.748 "abort": true, 00:19:08.748 "seek_hole": 
false, 00:19:08.748 "seek_data": false, 00:19:08.748 "copy": true, 00:19:08.748 "nvme_iov_md": false 00:19:08.748 }, 00:19:08.748 "memory_domains": [ 00:19:08.748 { 00:19:08.748 "dma_device_id": "system", 00:19:08.748 "dma_device_type": 1 00:19:08.748 }, 00:19:08.748 { 00:19:08.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.748 "dma_device_type": 2 00:19:08.748 } 00:19:08.748 ], 00:19:08.748 "driver_specific": {} 00:19:08.748 }' 00:19:08.748 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:08.748 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:08.748 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:08.748 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.748 22:26:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.748 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:08.748 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.748 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:09.005 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:09.005 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:09.005 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:09.005 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:09.005 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:09.005 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:09.005 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:09.262 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:09.262 "name": "BaseBdev3", 00:19:09.262 "aliases": [ 00:19:09.262 "3c44a257-eb77-4db3-b7c7-046663d130e4" 00:19:09.262 ], 00:19:09.262 "product_name": "Malloc disk", 00:19:09.262 "block_size": 512, 00:19:09.262 "num_blocks": 65536, 00:19:09.262 "uuid": "3c44a257-eb77-4db3-b7c7-046663d130e4", 00:19:09.262 "assigned_rate_limits": { 00:19:09.262 "rw_ios_per_sec": 0, 00:19:09.262 "rw_mbytes_per_sec": 0, 00:19:09.262 "r_mbytes_per_sec": 0, 00:19:09.262 "w_mbytes_per_sec": 0 00:19:09.262 }, 00:19:09.262 "claimed": true, 00:19:09.262 "claim_type": "exclusive_write", 00:19:09.262 "zoned": false, 00:19:09.262 "supported_io_types": { 00:19:09.262 "read": true, 00:19:09.262 "write": true, 00:19:09.262 "unmap": true, 00:19:09.262 "flush": true, 00:19:09.262 "reset": true, 00:19:09.262 "nvme_admin": false, 00:19:09.262 "nvme_io": false, 00:19:09.262 "nvme_io_md": false, 00:19:09.262 "write_zeroes": true, 00:19:09.262 "zcopy": true, 00:19:09.262 "get_zone_info": false, 00:19:09.262 "zone_management": false, 00:19:09.262 "zone_append": false, 00:19:09.262 "compare": false, 00:19:09.262 "compare_and_write": false, 00:19:09.262 "abort": true, 00:19:09.262 "seek_hole": false, 00:19:09.262 "seek_data": false, 00:19:09.262 "copy": true, 00:19:09.262 "nvme_iov_md": false 00:19:09.262 }, 00:19:09.262 
"memory_domains": [ 00:19:09.262 { 00:19:09.262 "dma_device_id": "system", 00:19:09.262 "dma_device_type": 1 00:19:09.262 }, 00:19:09.262 { 00:19:09.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.262 "dma_device_type": 2 00:19:09.262 } 00:19:09.262 ], 00:19:09.262 "driver_specific": {} 00:19:09.262 }' 00:19:09.262 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:09.262 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:09.262 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:09.262 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:09.262 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:09.520 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:09.520 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:09.520 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:09.520 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:09.520 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:09.520 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:09.520 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:09.520 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:09.520 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:09.520 22:26:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:09.777 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:09.777 "name": "BaseBdev4", 00:19:09.777 "aliases": [ 00:19:09.777 "06bfaeca-85f5-4b21-b56d-7783ebf7b245" 00:19:09.777 ], 00:19:09.777 "product_name": "Malloc disk", 00:19:09.777 "block_size": 512, 00:19:09.777 "num_blocks": 65536, 00:19:09.777 "uuid": "06bfaeca-85f5-4b21-b56d-7783ebf7b245", 00:19:09.777 "assigned_rate_limits": { 00:19:09.777 "rw_ios_per_sec": 0, 00:19:09.777 "rw_mbytes_per_sec": 0, 00:19:09.777 "r_mbytes_per_sec": 0, 00:19:09.777 "w_mbytes_per_sec": 0 00:19:09.777 }, 00:19:09.777 "claimed": true, 00:19:09.777 "claim_type": "exclusive_write", 00:19:09.777 "zoned": false, 00:19:09.777 "supported_io_types": { 00:19:09.777 "read": true, 00:19:09.777 "write": true, 00:19:09.777 "unmap": true, 00:19:09.777 "flush": true, 00:19:09.777 "reset": true, 00:19:09.777 "nvme_admin": false, 00:19:09.777 "nvme_io": false, 00:19:09.777 "nvme_io_md": false, 00:19:09.777 "write_zeroes": true, 00:19:09.777 "zcopy": true, 00:19:09.777 "get_zone_info": false, 00:19:09.777 "zone_management": false, 00:19:09.777 "zone_append": false, 00:19:09.777 "compare": false, 00:19:09.777 "compare_and_write": false, 00:19:09.777 "abort": true, 00:19:09.778 "seek_hole": false, 00:19:09.778 "seek_data": false, 00:19:09.778 "copy": true, 00:19:09.778 "nvme_iov_md": false 00:19:09.778 }, 00:19:09.778 "memory_domains": [ 00:19:09.778 { 00:19:09.778 "dma_device_id": "system", 00:19:09.778 "dma_device_type": 1 00:19:09.778 }, 
00:19:09.778 { 00:19:09.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.778 "dma_device_type": 2 00:19:09.778 } 00:19:09.778 ], 00:19:09.778 "driver_specific": {} 00:19:09.778 }' 00:19:09.778 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:09.778 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.035 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:10.035 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.035 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.035 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:10.035 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.035 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.035 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:10.035 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.292 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.292 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:10.292 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:10.550 [2024-07-12 22:26:20.639191] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:10.550 [2024-07-12 22:26:20.639219] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:10.550 [2024-07-12 22:26:20.639268] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:10.550 22:26:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.550 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:10.808 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:10.808 "name": "Existed_Raid", 00:19:10.808 "uuid": "b51a3720-d8a2-4c31-b6bd-462b69e7266e", 00:19:10.808 "strip_size_kb": 64, 00:19:10.808 "state": "offline", 00:19:10.808 "raid_level": "raid0", 00:19:10.808 "superblock": true, 00:19:10.808 "num_base_bdevs": 4, 00:19:10.808 "num_base_bdevs_discovered": 3, 00:19:10.808 "num_base_bdevs_operational": 3, 00:19:10.808 "base_bdevs_list": [ 00:19:10.808 { 00:19:10.808 "name": null, 00:19:10.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:10.808 "is_configured": false, 00:19:10.808 "data_offset": 2048, 00:19:10.808 "data_size": 63488 00:19:10.808 }, 00:19:10.808 { 00:19:10.808 "name": "BaseBdev2", 00:19:10.808 "uuid": "b6247073-35ab-4985-b858-9bf79609be4f", 00:19:10.808 "is_configured": true, 00:19:10.808 "data_offset": 2048, 00:19:10.808 "data_size": 63488 00:19:10.808 }, 00:19:10.808 { 00:19:10.808 "name": "BaseBdev3", 00:19:10.808 "uuid": "3c44a257-eb77-4db3-b7c7-046663d130e4", 00:19:10.808 "is_configured": true, 00:19:10.808 "data_offset": 2048, 00:19:10.808 "data_size": 63488 00:19:10.808 }, 00:19:10.808 { 00:19:10.808 "name": "BaseBdev4", 00:19:10.808 "uuid": "06bfaeca-85f5-4b21-b56d-7783ebf7b245", 00:19:10.808 "is_configured": true, 00:19:10.808 "data_offset": 2048, 00:19:10.808 "data_size": 63488 00:19:10.808 } 00:19:10.808 ] 00:19:10.808 }' 00:19:10.808 22:26:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:10.808 22:26:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:11.372 22:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:11.373 22:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:11.373 22:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.373 22:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:11.630 22:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:11.630 22:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:11.630 22:26:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:11.888 [2024-07-12 22:26:21.996692] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:11.888 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:11.888 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:11.888 22:26:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.888 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:12.146 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:12.146 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:12.146 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:12.422 [2024-07-12 22:26:22.494434] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:12.422 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:12.422 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:12.422 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.422 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:12.708 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:12.708 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:12.708 22:26:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:12.708 [2024-07-12 22:26:22.987072] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:12.708 [2024-07-12 22:26:22.987117] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ccf350 name Existed_Raid, state offline 00:19:12.708 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:12.708 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:12.708 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.708 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:12.970 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:12.970 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:12.970 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:12.970 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:12.970 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:12.970 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:13.227 BaseBdev2 00:19:13.227 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:13.227 22:26:23 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:13.227 22:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:13.227 22:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:13.227 22:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:13.227 22:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:13.227 22:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:13.484 22:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:13.741 [ 00:19:13.741 { 00:19:13.741 "name": "BaseBdev2", 00:19:13.741 "aliases": [ 00:19:13.741 "d56c1d4f-e315-41da-8808-334a501e98a6" 00:19:13.741 ], 00:19:13.741 "product_name": "Malloc disk", 00:19:13.741 "block_size": 512, 00:19:13.741 "num_blocks": 65536, 00:19:13.741 "uuid": "d56c1d4f-e315-41da-8808-334a501e98a6", 00:19:13.741 "assigned_rate_limits": { 00:19:13.741 "rw_ios_per_sec": 0, 00:19:13.741 "rw_mbytes_per_sec": 0, 00:19:13.741 "r_mbytes_per_sec": 0, 00:19:13.741 "w_mbytes_per_sec": 0 00:19:13.741 }, 00:19:13.741 "claimed": false, 00:19:13.741 "zoned": false, 00:19:13.741 "supported_io_types": { 00:19:13.741 "read": true, 00:19:13.741 "write": true, 00:19:13.741 "unmap": true, 00:19:13.741 "flush": true, 00:19:13.741 "reset": true, 00:19:13.741 "nvme_admin": false, 00:19:13.741 "nvme_io": false, 00:19:13.741 "nvme_io_md": false, 00:19:13.741 "write_zeroes": true, 00:19:13.741 "zcopy": true, 00:19:13.741 "get_zone_info": false, 00:19:13.741 "zone_management": false, 00:19:13.741 "zone_append": false, 00:19:13.741 "compare": false, 00:19:13.741 "compare_and_write": false, 00:19:13.741 "abort": true, 00:19:13.741 "seek_hole": false, 00:19:13.741 "seek_data": false, 00:19:13.741 "copy": true, 00:19:13.741 "nvme_iov_md": false 00:19:13.741 }, 00:19:13.741 "memory_domains": [ 00:19:13.741 { 00:19:13.741 "dma_device_id": "system", 00:19:13.741 "dma_device_type": 1 00:19:13.741 }, 00:19:13.741 { 00:19:13.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.741 "dma_device_type": 2 00:19:13.741 } 00:19:13.741 ], 00:19:13.741 "driver_specific": {} 00:19:13.741 } 00:19:13.741 ] 00:19:13.741 22:26:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:13.741 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:13.741 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:13.741 22:26:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:13.998 BaseBdev3 00:19:13.998 22:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:13.998 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:13.998 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:13.998 22:26:24 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:19:13.998 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:13.998 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:13.998 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:14.256 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:14.514 [ 00:19:14.514 { 00:19:14.514 "name": "BaseBdev3", 00:19:14.514 "aliases": [ 00:19:14.514 "3fdec900-fd7b-493c-a834-38f7649da025" 00:19:14.514 ], 00:19:14.514 "product_name": "Malloc disk", 00:19:14.514 "block_size": 512, 00:19:14.514 "num_blocks": 65536, 00:19:14.514 "uuid": "3fdec900-fd7b-493c-a834-38f7649da025", 00:19:14.514 "assigned_rate_limits": { 00:19:14.514 "rw_ios_per_sec": 0, 00:19:14.514 "rw_mbytes_per_sec": 0, 00:19:14.514 "r_mbytes_per_sec": 0, 00:19:14.514 "w_mbytes_per_sec": 0 00:19:14.514 }, 00:19:14.514 "claimed": false, 00:19:14.514 "zoned": false, 00:19:14.514 "supported_io_types": { 00:19:14.514 "read": true, 00:19:14.514 "write": true, 00:19:14.514 "unmap": true, 00:19:14.514 "flush": true, 00:19:14.514 "reset": true, 00:19:14.514 "nvme_admin": false, 00:19:14.514 "nvme_io": false, 00:19:14.514 "nvme_io_md": false, 00:19:14.514 "write_zeroes": true, 00:19:14.514 "zcopy": true, 00:19:14.514 "get_zone_info": false, 00:19:14.514 "zone_management": false, 00:19:14.514 "zone_append": false, 00:19:14.514 "compare": false, 00:19:14.514 "compare_and_write": false, 00:19:14.514 "abort": true, 00:19:14.514 "seek_hole": false, 00:19:14.514 "seek_data": false, 00:19:14.514 "copy": true, 00:19:14.514 "nvme_iov_md": false 00:19:14.514 }, 00:19:14.514 "memory_domains": [ 00:19:14.514 { 00:19:14.514 "dma_device_id": "system", 00:19:14.514 "dma_device_type": 1 00:19:14.514 }, 00:19:14.514 { 00:19:14.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:14.514 "dma_device_type": 2 00:19:14.514 } 00:19:14.514 ], 00:19:14.514 "driver_specific": {} 00:19:14.514 } 00:19:14.514 ] 00:19:14.514 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:14.514 22:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:14.514 22:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:14.514 22:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:14.771 BaseBdev4 00:19:14.771 22:26:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:14.771 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:14.771 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:14.771 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:14.771 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:14.771 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:19:14.771 22:26:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:15.029 22:26:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:15.287 [ 00:19:15.287 { 00:19:15.287 "name": "BaseBdev4", 00:19:15.287 "aliases": [ 00:19:15.287 "a1ae25c0-beed-4103-a5fc-f030ad84bbeb" 00:19:15.287 ], 00:19:15.287 "product_name": "Malloc disk", 00:19:15.287 "block_size": 512, 00:19:15.287 "num_blocks": 65536, 00:19:15.287 "uuid": "a1ae25c0-beed-4103-a5fc-f030ad84bbeb", 00:19:15.287 "assigned_rate_limits": { 00:19:15.287 "rw_ios_per_sec": 0, 00:19:15.287 "rw_mbytes_per_sec": 0, 00:19:15.287 "r_mbytes_per_sec": 0, 00:19:15.287 "w_mbytes_per_sec": 0 00:19:15.287 }, 00:19:15.287 "claimed": false, 00:19:15.287 "zoned": false, 00:19:15.287 "supported_io_types": { 00:19:15.287 "read": true, 00:19:15.287 "write": true, 00:19:15.287 "unmap": true, 00:19:15.287 "flush": true, 00:19:15.287 "reset": true, 00:19:15.287 "nvme_admin": false, 00:19:15.287 "nvme_io": false, 00:19:15.287 "nvme_io_md": false, 00:19:15.287 "write_zeroes": true, 00:19:15.287 "zcopy": true, 00:19:15.287 "get_zone_info": false, 00:19:15.287 "zone_management": false, 00:19:15.287 "zone_append": false, 00:19:15.287 "compare": false, 00:19:15.287 "compare_and_write": false, 00:19:15.287 "abort": true, 00:19:15.287 "seek_hole": false, 00:19:15.287 "seek_data": false, 00:19:15.287 "copy": true, 00:19:15.287 "nvme_iov_md": false 00:19:15.287 }, 00:19:15.287 "memory_domains": [ 00:19:15.287 { 00:19:15.287 "dma_device_id": "system", 00:19:15.287 "dma_device_type": 1 00:19:15.287 }, 00:19:15.287 { 00:19:15.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:15.287 "dma_device_type": 2 00:19:15.287 } 00:19:15.287 ], 00:19:15.287 "driver_specific": {} 00:19:15.287 } 00:19:15.287 ] 00:19:15.287 22:26:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:15.287 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:15.287 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:15.287 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:15.545 [2024-07-12 22:26:25.695643] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:15.545 [2024-07-12 22:26:25.695683] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:15.545 [2024-07-12 22:26:25.695703] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:15.545 [2024-07-12 22:26:25.697054] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:15.545 [2024-07-12 22:26:25.697097] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:15.545 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:15.545 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:19:15.545 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:15.545 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:15.545 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:15.545 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:15.545 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.545 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.545 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.545 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.545 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.545 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:15.803 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.803 "name": "Existed_Raid", 00:19:15.803 "uuid": "905b5834-12c8-48f5-94e3-819417ea17f3", 00:19:15.803 "strip_size_kb": 64, 00:19:15.803 "state": "configuring", 00:19:15.803 "raid_level": "raid0", 00:19:15.803 "superblock": true, 00:19:15.803 "num_base_bdevs": 4, 00:19:15.803 "num_base_bdevs_discovered": 3, 00:19:15.803 "num_base_bdevs_operational": 4, 00:19:15.803 "base_bdevs_list": [ 00:19:15.803 { 00:19:15.803 "name": "BaseBdev1", 00:19:15.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.803 "is_configured": false, 00:19:15.803 "data_offset": 0, 00:19:15.803 "data_size": 0 00:19:15.803 }, 00:19:15.803 { 00:19:15.803 "name": "BaseBdev2", 00:19:15.803 "uuid": "d56c1d4f-e315-41da-8808-334a501e98a6", 00:19:15.803 "is_configured": true, 00:19:15.803 "data_offset": 2048, 00:19:15.803 "data_size": 63488 00:19:15.803 }, 00:19:15.803 { 00:19:15.803 "name": "BaseBdev3", 00:19:15.803 "uuid": "3fdec900-fd7b-493c-a834-38f7649da025", 00:19:15.803 "is_configured": true, 00:19:15.803 "data_offset": 2048, 00:19:15.803 "data_size": 63488 00:19:15.803 }, 00:19:15.803 { 00:19:15.803 "name": "BaseBdev4", 00:19:15.803 "uuid": "a1ae25c0-beed-4103-a5fc-f030ad84bbeb", 00:19:15.803 "is_configured": true, 00:19:15.803 "data_offset": 2048, 00:19:15.803 "data_size": 63488 00:19:15.803 } 00:19:15.803 ] 00:19:15.803 }' 00:19:15.803 22:26:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.803 22:26:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:16.367 22:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:16.625 [2024-07-12 22:26:26.762445] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:16.625 22:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:16.625 22:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:16.625 22:26:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:16.625 22:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:16.625 22:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:16.625 22:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:16.625 22:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:16.625 22:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:16.625 22:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:16.625 22:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:16.625 22:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.625 22:26:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:16.883 22:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:16.883 "name": "Existed_Raid", 00:19:16.883 "uuid": "905b5834-12c8-48f5-94e3-819417ea17f3", 00:19:16.883 "strip_size_kb": 64, 00:19:16.883 "state": "configuring", 00:19:16.883 "raid_level": "raid0", 00:19:16.883 "superblock": true, 00:19:16.883 "num_base_bdevs": 4, 00:19:16.883 "num_base_bdevs_discovered": 2, 00:19:16.883 "num_base_bdevs_operational": 4, 00:19:16.883 "base_bdevs_list": [ 00:19:16.883 { 00:19:16.883 "name": "BaseBdev1", 00:19:16.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.883 "is_configured": false, 00:19:16.883 "data_offset": 0, 00:19:16.883 "data_size": 0 00:19:16.883 }, 00:19:16.883 { 00:19:16.883 "name": null, 00:19:16.883 "uuid": "d56c1d4f-e315-41da-8808-334a501e98a6", 00:19:16.883 "is_configured": false, 00:19:16.883 "data_offset": 2048, 00:19:16.883 "data_size": 63488 00:19:16.883 }, 00:19:16.883 { 00:19:16.883 "name": "BaseBdev3", 00:19:16.883 "uuid": "3fdec900-fd7b-493c-a834-38f7649da025", 00:19:16.883 "is_configured": true, 00:19:16.883 "data_offset": 2048, 00:19:16.883 "data_size": 63488 00:19:16.883 }, 00:19:16.883 { 00:19:16.883 "name": "BaseBdev4", 00:19:16.883 "uuid": "a1ae25c0-beed-4103-a5fc-f030ad84bbeb", 00:19:16.883 "is_configured": true, 00:19:16.883 "data_offset": 2048, 00:19:16.883 "data_size": 63488 00:19:16.883 } 00:19:16.883 ] 00:19:16.883 }' 00:19:16.883 22:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:16.883 22:26:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:17.448 22:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:17.448 22:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.705 22:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:17.705 22:26:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 
00:19:17.963 [2024-07-12 22:26:28.134746] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:17.963 BaseBdev1 00:19:17.963 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:17.963 22:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:17.963 22:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:17.963 22:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:17.963 22:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:17.963 22:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:17.963 22:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:18.220 22:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:18.478 [ 00:19:18.478 { 00:19:18.478 "name": "BaseBdev1", 00:19:18.478 "aliases": [ 00:19:18.478 "da342e40-6d20-4354-af23-0cdea1ad01eb" 00:19:18.478 ], 00:19:18.478 "product_name": "Malloc disk", 00:19:18.478 "block_size": 512, 00:19:18.478 "num_blocks": 65536, 00:19:18.478 "uuid": "da342e40-6d20-4354-af23-0cdea1ad01eb", 00:19:18.478 "assigned_rate_limits": { 00:19:18.478 "rw_ios_per_sec": 0, 00:19:18.478 "rw_mbytes_per_sec": 0, 00:19:18.478 "r_mbytes_per_sec": 0, 00:19:18.478 "w_mbytes_per_sec": 0 00:19:18.478 }, 00:19:18.478 "claimed": true, 00:19:18.478 "claim_type": "exclusive_write", 00:19:18.478 "zoned": false, 00:19:18.478 "supported_io_types": { 00:19:18.478 "read": true, 00:19:18.478 "write": true, 00:19:18.478 "unmap": true, 00:19:18.478 "flush": true, 00:19:18.478 "reset": true, 00:19:18.478 "nvme_admin": false, 00:19:18.478 "nvme_io": false, 00:19:18.478 "nvme_io_md": false, 00:19:18.478 "write_zeroes": true, 00:19:18.478 "zcopy": true, 00:19:18.478 "get_zone_info": false, 00:19:18.478 "zone_management": false, 00:19:18.478 "zone_append": false, 00:19:18.478 "compare": false, 00:19:18.478 "compare_and_write": false, 00:19:18.478 "abort": true, 00:19:18.478 "seek_hole": false, 00:19:18.478 "seek_data": false, 00:19:18.478 "copy": true, 00:19:18.478 "nvme_iov_md": false 00:19:18.478 }, 00:19:18.478 "memory_domains": [ 00:19:18.478 { 00:19:18.478 "dma_device_id": "system", 00:19:18.478 "dma_device_type": 1 00:19:18.478 }, 00:19:18.478 { 00:19:18.478 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.478 "dma_device_type": 2 00:19:18.478 } 00:19:18.478 ], 00:19:18.478 "driver_specific": {} 00:19:18.478 } 00:19:18.478 ] 00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.478 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:18.735 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.735 "name": "Existed_Raid", 00:19:18.735 "uuid": "905b5834-12c8-48f5-94e3-819417ea17f3", 00:19:18.735 "strip_size_kb": 64, 00:19:18.735 "state": "configuring", 00:19:18.735 "raid_level": "raid0", 00:19:18.735 "superblock": true, 00:19:18.735 "num_base_bdevs": 4, 00:19:18.735 "num_base_bdevs_discovered": 3, 00:19:18.735 "num_base_bdevs_operational": 4, 00:19:18.735 "base_bdevs_list": [ 00:19:18.735 { 00:19:18.735 "name": "BaseBdev1", 00:19:18.735 "uuid": "da342e40-6d20-4354-af23-0cdea1ad01eb", 00:19:18.735 "is_configured": true, 00:19:18.735 "data_offset": 2048, 00:19:18.735 "data_size": 63488 00:19:18.735 }, 00:19:18.735 { 00:19:18.735 "name": null, 00:19:18.735 "uuid": "d56c1d4f-e315-41da-8808-334a501e98a6", 00:19:18.735 "is_configured": false, 00:19:18.735 "data_offset": 2048, 00:19:18.735 "data_size": 63488 00:19:18.735 }, 00:19:18.735 { 00:19:18.735 "name": "BaseBdev3", 00:19:18.735 "uuid": "3fdec900-fd7b-493c-a834-38f7649da025", 00:19:18.735 "is_configured": true, 00:19:18.735 "data_offset": 2048, 00:19:18.735 "data_size": 63488 00:19:18.735 }, 00:19:18.735 { 00:19:18.735 "name": "BaseBdev4", 00:19:18.735 "uuid": "a1ae25c0-beed-4103-a5fc-f030ad84bbeb", 00:19:18.735 "is_configured": true, 00:19:18.735 "data_offset": 2048, 00:19:18.735 "data_size": 63488 00:19:18.735 } 00:19:18.735 ] 00:19:18.735 }' 00:19:18.735 22:26:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.736 22:26:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:19.300 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.300 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:19.557 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:19.557 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:19.814 [2024-07-12 22:26:29.887417] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:19.814 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:19.814 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:19.814 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:19.814 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:19.814 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:19.814 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:19.814 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:19.814 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:19.814 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:19.814 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:19.814 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.814 22:26:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:20.071 22:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.071 "name": "Existed_Raid", 00:19:20.071 "uuid": "905b5834-12c8-48f5-94e3-819417ea17f3", 00:19:20.071 "strip_size_kb": 64, 00:19:20.071 "state": "configuring", 00:19:20.071 "raid_level": "raid0", 00:19:20.071 "superblock": true, 00:19:20.071 "num_base_bdevs": 4, 00:19:20.071 "num_base_bdevs_discovered": 2, 00:19:20.071 "num_base_bdevs_operational": 4, 00:19:20.071 "base_bdevs_list": [ 00:19:20.071 { 00:19:20.071 "name": "BaseBdev1", 00:19:20.071 "uuid": "da342e40-6d20-4354-af23-0cdea1ad01eb", 00:19:20.071 "is_configured": true, 00:19:20.071 "data_offset": 2048, 00:19:20.071 "data_size": 63488 00:19:20.071 }, 00:19:20.071 { 00:19:20.071 "name": null, 00:19:20.071 "uuid": "d56c1d4f-e315-41da-8808-334a501e98a6", 00:19:20.071 "is_configured": false, 00:19:20.071 "data_offset": 2048, 00:19:20.071 "data_size": 63488 00:19:20.071 }, 00:19:20.071 { 00:19:20.071 "name": null, 00:19:20.071 "uuid": "3fdec900-fd7b-493c-a834-38f7649da025", 00:19:20.071 "is_configured": false, 00:19:20.071 "data_offset": 2048, 00:19:20.071 "data_size": 63488 00:19:20.071 }, 00:19:20.071 { 00:19:20.071 "name": "BaseBdev4", 00:19:20.071 "uuid": "a1ae25c0-beed-4103-a5fc-f030ad84bbeb", 00:19:20.071 "is_configured": true, 00:19:20.071 "data_offset": 2048, 00:19:20.071 "data_size": 63488 00:19:20.071 } 00:19:20.071 ] 00:19:20.071 }' 00:19:20.071 22:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.071 22:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:20.633 22:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.633 22:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:20.890 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:20.890 
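The @319-@321 steps recorded above amount to: confirm the removed slot now reads is_configured == false, re-add the base bdev, then re-verify the raid state. A minimal sketch of performing the same checks by hand, assuming the test app is still listening on /var/tmp/spdk-raid.sock and the raid bdev is named Existed_Raid (both taken from the surrounding log); the $RPC/$SOCK shorthand and the trailing "| .state" jq step are illustrative additions, not the exact helpers the test script uses:

# shorthand for the rpc.py path and socket used throughout this log
RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock

# is the removed slot still unconfigured? (mirrors bdev_raid.sh@319)
$RPC -s $SOCK bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'

# re-attach the base bdev (mirrors bdev_raid.sh@321)
$RPC -s $SOCK bdev_raid_add_base_bdev Existed_Raid BaseBdev3

# re-check the overall raid state; it is expected to stay "configuring"
# while the BaseBdev2 slot remains unconfigured
$RPC -s $SOCK bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'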
22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:21.147 [2024-07-12 22:26:31.247036] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:21.147 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:21.147 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:21.147 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:21.147 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:21.147 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:21.147 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:21.147 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:21.147 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:21.147 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:21.147 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:21.147 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.147 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:21.403 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:21.403 "name": "Existed_Raid", 00:19:21.403 "uuid": "905b5834-12c8-48f5-94e3-819417ea17f3", 00:19:21.403 "strip_size_kb": 64, 00:19:21.403 "state": "configuring", 00:19:21.403 "raid_level": "raid0", 00:19:21.403 "superblock": true, 00:19:21.403 "num_base_bdevs": 4, 00:19:21.403 "num_base_bdevs_discovered": 3, 00:19:21.403 "num_base_bdevs_operational": 4, 00:19:21.403 "base_bdevs_list": [ 00:19:21.403 { 00:19:21.403 "name": "BaseBdev1", 00:19:21.403 "uuid": "da342e40-6d20-4354-af23-0cdea1ad01eb", 00:19:21.403 "is_configured": true, 00:19:21.403 "data_offset": 2048, 00:19:21.403 "data_size": 63488 00:19:21.403 }, 00:19:21.403 { 00:19:21.403 "name": null, 00:19:21.403 "uuid": "d56c1d4f-e315-41da-8808-334a501e98a6", 00:19:21.403 "is_configured": false, 00:19:21.403 "data_offset": 2048, 00:19:21.403 "data_size": 63488 00:19:21.403 }, 00:19:21.403 { 00:19:21.403 "name": "BaseBdev3", 00:19:21.403 "uuid": "3fdec900-fd7b-493c-a834-38f7649da025", 00:19:21.403 "is_configured": true, 00:19:21.403 "data_offset": 2048, 00:19:21.403 "data_size": 63488 00:19:21.403 }, 00:19:21.403 { 00:19:21.403 "name": "BaseBdev4", 00:19:21.403 "uuid": "a1ae25c0-beed-4103-a5fc-f030ad84bbeb", 00:19:21.403 "is_configured": true, 00:19:21.403 "data_offset": 2048, 00:19:21.403 "data_size": 63488 00:19:21.403 } 00:19:21.403 ] 00:19:21.403 }' 00:19:21.403 22:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:21.403 22:26:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:21.965 22:26:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.965 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:22.222 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:22.222 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:22.478 [2024-07-12 22:26:32.590627] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:22.478 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:22.478 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:22.478 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:22.478 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:22.478 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:22.478 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:22.478 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:22.478 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:22.478 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:22.478 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:22.478 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.478 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:22.735 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:22.735 "name": "Existed_Raid", 00:19:22.735 "uuid": "905b5834-12c8-48f5-94e3-819417ea17f3", 00:19:22.735 "strip_size_kb": 64, 00:19:22.735 "state": "configuring", 00:19:22.735 "raid_level": "raid0", 00:19:22.735 "superblock": true, 00:19:22.735 "num_base_bdevs": 4, 00:19:22.735 "num_base_bdevs_discovered": 2, 00:19:22.735 "num_base_bdevs_operational": 4, 00:19:22.735 "base_bdevs_list": [ 00:19:22.735 { 00:19:22.735 "name": null, 00:19:22.735 "uuid": "da342e40-6d20-4354-af23-0cdea1ad01eb", 00:19:22.735 "is_configured": false, 00:19:22.735 "data_offset": 2048, 00:19:22.735 "data_size": 63488 00:19:22.735 }, 00:19:22.735 { 00:19:22.735 "name": null, 00:19:22.735 "uuid": "d56c1d4f-e315-41da-8808-334a501e98a6", 00:19:22.735 "is_configured": false, 00:19:22.735 "data_offset": 2048, 00:19:22.735 "data_size": 63488 00:19:22.735 }, 00:19:22.735 { 00:19:22.735 "name": "BaseBdev3", 00:19:22.735 "uuid": "3fdec900-fd7b-493c-a834-38f7649da025", 00:19:22.735 "is_configured": true, 00:19:22.735 "data_offset": 2048, 00:19:22.735 "data_size": 63488 00:19:22.735 }, 00:19:22.735 { 00:19:22.735 "name": "BaseBdev4", 00:19:22.735 "uuid": 
"a1ae25c0-beed-4103-a5fc-f030ad84bbeb", 00:19:22.735 "is_configured": true, 00:19:22.735 "data_offset": 2048, 00:19:22.735 "data_size": 63488 00:19:22.735 } 00:19:22.735 ] 00:19:22.735 }' 00:19:22.735 22:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:22.735 22:26:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:23.666 22:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:23.666 22:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.666 22:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:23.923 22:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:23.923 [2024-07-12 22:26:34.217656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:23.923 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:23.923 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:23.923 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:23.923 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:23.923 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:23.923 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:23.923 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:23.923 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:23.923 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:23.923 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:23.923 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.923 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:24.181 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:24.181 "name": "Existed_Raid", 00:19:24.181 "uuid": "905b5834-12c8-48f5-94e3-819417ea17f3", 00:19:24.181 "strip_size_kb": 64, 00:19:24.181 "state": "configuring", 00:19:24.181 "raid_level": "raid0", 00:19:24.181 "superblock": true, 00:19:24.181 "num_base_bdevs": 4, 00:19:24.181 "num_base_bdevs_discovered": 3, 00:19:24.181 "num_base_bdevs_operational": 4, 00:19:24.181 "base_bdevs_list": [ 00:19:24.181 { 00:19:24.181 "name": null, 00:19:24.181 "uuid": "da342e40-6d20-4354-af23-0cdea1ad01eb", 00:19:24.181 "is_configured": false, 00:19:24.181 "data_offset": 2048, 00:19:24.181 "data_size": 63488 00:19:24.181 }, 00:19:24.181 { 00:19:24.181 "name": "BaseBdev2", 00:19:24.181 "uuid": 
"d56c1d4f-e315-41da-8808-334a501e98a6", 00:19:24.181 "is_configured": true, 00:19:24.181 "data_offset": 2048, 00:19:24.181 "data_size": 63488 00:19:24.181 }, 00:19:24.181 { 00:19:24.181 "name": "BaseBdev3", 00:19:24.181 "uuid": "3fdec900-fd7b-493c-a834-38f7649da025", 00:19:24.181 "is_configured": true, 00:19:24.181 "data_offset": 2048, 00:19:24.181 "data_size": 63488 00:19:24.181 }, 00:19:24.181 { 00:19:24.181 "name": "BaseBdev4", 00:19:24.181 "uuid": "a1ae25c0-beed-4103-a5fc-f030ad84bbeb", 00:19:24.181 "is_configured": true, 00:19:24.181 "data_offset": 2048, 00:19:24.181 "data_size": 63488 00:19:24.181 } 00:19:24.181 ] 00:19:24.181 }' 00:19:24.181 22:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:24.181 22:26:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:25.113 22:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.113 22:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:25.113 22:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:25.113 22:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.113 22:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:25.371 22:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u da342e40-6d20-4354-af23-0cdea1ad01eb 00:19:25.371 [2024-07-12 22:26:35.669000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:25.371 [2024-07-12 22:26:35.669157] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cd5470 00:19:25.371 [2024-07-12 22:26:35.669170] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:25.371 [2024-07-12 22:26:35.669354] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cc5c40 00:19:25.371 [2024-07-12 22:26:35.669471] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cd5470 00:19:25.371 [2024-07-12 22:26:35.669481] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cd5470 00:19:25.371 [2024-07-12 22:26:35.669572] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:25.371 NewBaseBdev 00:19:25.371 22:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:25.371 22:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:25.371 22:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:25.371 22:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:25.371 22:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:25.371 22:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:25.371 22:26:35 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:25.629 22:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:25.887 [ 00:19:25.887 { 00:19:25.887 "name": "NewBaseBdev", 00:19:25.887 "aliases": [ 00:19:25.887 "da342e40-6d20-4354-af23-0cdea1ad01eb" 00:19:25.887 ], 00:19:25.887 "product_name": "Malloc disk", 00:19:25.887 "block_size": 512, 00:19:25.887 "num_blocks": 65536, 00:19:25.887 "uuid": "da342e40-6d20-4354-af23-0cdea1ad01eb", 00:19:25.887 "assigned_rate_limits": { 00:19:25.887 "rw_ios_per_sec": 0, 00:19:25.887 "rw_mbytes_per_sec": 0, 00:19:25.887 "r_mbytes_per_sec": 0, 00:19:25.887 "w_mbytes_per_sec": 0 00:19:25.887 }, 00:19:25.887 "claimed": true, 00:19:25.887 "claim_type": "exclusive_write", 00:19:25.887 "zoned": false, 00:19:25.887 "supported_io_types": { 00:19:25.887 "read": true, 00:19:25.887 "write": true, 00:19:25.887 "unmap": true, 00:19:25.887 "flush": true, 00:19:25.887 "reset": true, 00:19:25.887 "nvme_admin": false, 00:19:25.887 "nvme_io": false, 00:19:25.887 "nvme_io_md": false, 00:19:25.887 "write_zeroes": true, 00:19:25.887 "zcopy": true, 00:19:25.887 "get_zone_info": false, 00:19:25.887 "zone_management": false, 00:19:25.887 "zone_append": false, 00:19:25.887 "compare": false, 00:19:25.887 "compare_and_write": false, 00:19:25.887 "abort": true, 00:19:25.887 "seek_hole": false, 00:19:25.887 "seek_data": false, 00:19:25.887 "copy": true, 00:19:25.887 "nvme_iov_md": false 00:19:25.887 }, 00:19:25.887 "memory_domains": [ 00:19:25.887 { 00:19:25.887 "dma_device_id": "system", 00:19:25.887 "dma_device_type": 1 00:19:25.887 }, 00:19:25.887 { 00:19:25.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.887 "dma_device_type": 2 00:19:25.887 } 00:19:25.887 ], 00:19:25.887 "driver_specific": {} 00:19:25.887 } 00:19:25.887 ] 00:19:25.887 22:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:25.887 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:25.887 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:25.887 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:25.887 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:25.887 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:25.887 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:25.887 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:25.887 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:25.887 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:25.887 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:25.887 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:25.887 22:26:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.145 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:26.145 "name": "Existed_Raid", 00:19:26.145 "uuid": "905b5834-12c8-48f5-94e3-819417ea17f3", 00:19:26.145 "strip_size_kb": 64, 00:19:26.145 "state": "online", 00:19:26.145 "raid_level": "raid0", 00:19:26.145 "superblock": true, 00:19:26.145 "num_base_bdevs": 4, 00:19:26.145 "num_base_bdevs_discovered": 4, 00:19:26.145 "num_base_bdevs_operational": 4, 00:19:26.145 "base_bdevs_list": [ 00:19:26.145 { 00:19:26.145 "name": "NewBaseBdev", 00:19:26.145 "uuid": "da342e40-6d20-4354-af23-0cdea1ad01eb", 00:19:26.145 "is_configured": true, 00:19:26.145 "data_offset": 2048, 00:19:26.145 "data_size": 63488 00:19:26.145 }, 00:19:26.145 { 00:19:26.145 "name": "BaseBdev2", 00:19:26.145 "uuid": "d56c1d4f-e315-41da-8808-334a501e98a6", 00:19:26.145 "is_configured": true, 00:19:26.145 "data_offset": 2048, 00:19:26.145 "data_size": 63488 00:19:26.145 }, 00:19:26.145 { 00:19:26.145 "name": "BaseBdev3", 00:19:26.145 "uuid": "3fdec900-fd7b-493c-a834-38f7649da025", 00:19:26.145 "is_configured": true, 00:19:26.145 "data_offset": 2048, 00:19:26.145 "data_size": 63488 00:19:26.145 }, 00:19:26.145 { 00:19:26.145 "name": "BaseBdev4", 00:19:26.145 "uuid": "a1ae25c0-beed-4103-a5fc-f030ad84bbeb", 00:19:26.145 "is_configured": true, 00:19:26.145 "data_offset": 2048, 00:19:26.145 "data_size": 63488 00:19:26.145 } 00:19:26.145 ] 00:19:26.145 }' 00:19:26.145 22:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:26.145 22:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:27.077 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:27.077 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:27.077 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:27.077 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:27.077 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:27.077 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:27.077 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:27.077 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:27.077 [2024-07-12 22:26:37.269614] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:27.077 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:27.077 "name": "Existed_Raid", 00:19:27.077 "aliases": [ 00:19:27.077 "905b5834-12c8-48f5-94e3-819417ea17f3" 00:19:27.077 ], 00:19:27.077 "product_name": "Raid Volume", 00:19:27.077 "block_size": 512, 00:19:27.077 "num_blocks": 253952, 00:19:27.077 "uuid": "905b5834-12c8-48f5-94e3-819417ea17f3", 00:19:27.077 "assigned_rate_limits": { 00:19:27.077 "rw_ios_per_sec": 0, 00:19:27.077 "rw_mbytes_per_sec": 0, 00:19:27.077 "r_mbytes_per_sec": 0, 00:19:27.077 "w_mbytes_per_sec": 0 
00:19:27.077 }, 00:19:27.077 "claimed": false, 00:19:27.077 "zoned": false, 00:19:27.077 "supported_io_types": { 00:19:27.077 "read": true, 00:19:27.077 "write": true, 00:19:27.077 "unmap": true, 00:19:27.077 "flush": true, 00:19:27.077 "reset": true, 00:19:27.077 "nvme_admin": false, 00:19:27.077 "nvme_io": false, 00:19:27.077 "nvme_io_md": false, 00:19:27.077 "write_zeroes": true, 00:19:27.077 "zcopy": false, 00:19:27.077 "get_zone_info": false, 00:19:27.077 "zone_management": false, 00:19:27.077 "zone_append": false, 00:19:27.077 "compare": false, 00:19:27.077 "compare_and_write": false, 00:19:27.077 "abort": false, 00:19:27.077 "seek_hole": false, 00:19:27.077 "seek_data": false, 00:19:27.077 "copy": false, 00:19:27.077 "nvme_iov_md": false 00:19:27.077 }, 00:19:27.077 "memory_domains": [ 00:19:27.077 { 00:19:27.077 "dma_device_id": "system", 00:19:27.077 "dma_device_type": 1 00:19:27.077 }, 00:19:27.077 { 00:19:27.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.077 "dma_device_type": 2 00:19:27.077 }, 00:19:27.077 { 00:19:27.077 "dma_device_id": "system", 00:19:27.077 "dma_device_type": 1 00:19:27.077 }, 00:19:27.077 { 00:19:27.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.077 "dma_device_type": 2 00:19:27.077 }, 00:19:27.077 { 00:19:27.078 "dma_device_id": "system", 00:19:27.078 "dma_device_type": 1 00:19:27.078 }, 00:19:27.078 { 00:19:27.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.078 "dma_device_type": 2 00:19:27.078 }, 00:19:27.078 { 00:19:27.078 "dma_device_id": "system", 00:19:27.078 "dma_device_type": 1 00:19:27.078 }, 00:19:27.078 { 00:19:27.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.078 "dma_device_type": 2 00:19:27.078 } 00:19:27.078 ], 00:19:27.078 "driver_specific": { 00:19:27.078 "raid": { 00:19:27.078 "uuid": "905b5834-12c8-48f5-94e3-819417ea17f3", 00:19:27.078 "strip_size_kb": 64, 00:19:27.078 "state": "online", 00:19:27.078 "raid_level": "raid0", 00:19:27.078 "superblock": true, 00:19:27.078 "num_base_bdevs": 4, 00:19:27.078 "num_base_bdevs_discovered": 4, 00:19:27.078 "num_base_bdevs_operational": 4, 00:19:27.078 "base_bdevs_list": [ 00:19:27.078 { 00:19:27.078 "name": "NewBaseBdev", 00:19:27.078 "uuid": "da342e40-6d20-4354-af23-0cdea1ad01eb", 00:19:27.078 "is_configured": true, 00:19:27.078 "data_offset": 2048, 00:19:27.078 "data_size": 63488 00:19:27.078 }, 00:19:27.078 { 00:19:27.078 "name": "BaseBdev2", 00:19:27.078 "uuid": "d56c1d4f-e315-41da-8808-334a501e98a6", 00:19:27.078 "is_configured": true, 00:19:27.078 "data_offset": 2048, 00:19:27.078 "data_size": 63488 00:19:27.078 }, 00:19:27.078 { 00:19:27.078 "name": "BaseBdev3", 00:19:27.078 "uuid": "3fdec900-fd7b-493c-a834-38f7649da025", 00:19:27.078 "is_configured": true, 00:19:27.078 "data_offset": 2048, 00:19:27.078 "data_size": 63488 00:19:27.078 }, 00:19:27.078 { 00:19:27.078 "name": "BaseBdev4", 00:19:27.078 "uuid": "a1ae25c0-beed-4103-a5fc-f030ad84bbeb", 00:19:27.078 "is_configured": true, 00:19:27.078 "data_offset": 2048, 00:19:27.078 "data_size": 63488 00:19:27.078 } 00:19:27.078 ] 00:19:27.078 } 00:19:27.078 } 00:19:27.078 }' 00:19:27.078 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:27.078 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:27.078 BaseBdev2 00:19:27.078 BaseBdev3 00:19:27.078 BaseBdev4' 00:19:27.078 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 
-- # for name in $base_bdev_names 00:19:27.078 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:27.078 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:27.335 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:27.335 "name": "NewBaseBdev", 00:19:27.335 "aliases": [ 00:19:27.335 "da342e40-6d20-4354-af23-0cdea1ad01eb" 00:19:27.335 ], 00:19:27.335 "product_name": "Malloc disk", 00:19:27.335 "block_size": 512, 00:19:27.335 "num_blocks": 65536, 00:19:27.335 "uuid": "da342e40-6d20-4354-af23-0cdea1ad01eb", 00:19:27.335 "assigned_rate_limits": { 00:19:27.335 "rw_ios_per_sec": 0, 00:19:27.335 "rw_mbytes_per_sec": 0, 00:19:27.335 "r_mbytes_per_sec": 0, 00:19:27.335 "w_mbytes_per_sec": 0 00:19:27.335 }, 00:19:27.335 "claimed": true, 00:19:27.335 "claim_type": "exclusive_write", 00:19:27.335 "zoned": false, 00:19:27.335 "supported_io_types": { 00:19:27.335 "read": true, 00:19:27.335 "write": true, 00:19:27.335 "unmap": true, 00:19:27.335 "flush": true, 00:19:27.335 "reset": true, 00:19:27.335 "nvme_admin": false, 00:19:27.335 "nvme_io": false, 00:19:27.335 "nvme_io_md": false, 00:19:27.335 "write_zeroes": true, 00:19:27.335 "zcopy": true, 00:19:27.335 "get_zone_info": false, 00:19:27.335 "zone_management": false, 00:19:27.335 "zone_append": false, 00:19:27.335 "compare": false, 00:19:27.335 "compare_and_write": false, 00:19:27.335 "abort": true, 00:19:27.335 "seek_hole": false, 00:19:27.335 "seek_data": false, 00:19:27.335 "copy": true, 00:19:27.335 "nvme_iov_md": false 00:19:27.335 }, 00:19:27.335 "memory_domains": [ 00:19:27.335 { 00:19:27.335 "dma_device_id": "system", 00:19:27.335 "dma_device_type": 1 00:19:27.335 }, 00:19:27.335 { 00:19:27.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.335 "dma_device_type": 2 00:19:27.335 } 00:19:27.335 ], 00:19:27.335 "driver_specific": {} 00:19:27.335 }' 00:19:27.335 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:27.335 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:27.593 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:27.593 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:27.593 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:27.593 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:27.593 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:27.593 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:27.593 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:27.593 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:27.593 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:27.851 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:27.851 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:27.851 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:27.851 22:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:28.108 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:28.108 "name": "BaseBdev2", 00:19:28.108 "aliases": [ 00:19:28.108 "d56c1d4f-e315-41da-8808-334a501e98a6" 00:19:28.108 ], 00:19:28.108 "product_name": "Malloc disk", 00:19:28.108 "block_size": 512, 00:19:28.108 "num_blocks": 65536, 00:19:28.108 "uuid": "d56c1d4f-e315-41da-8808-334a501e98a6", 00:19:28.108 "assigned_rate_limits": { 00:19:28.108 "rw_ios_per_sec": 0, 00:19:28.108 "rw_mbytes_per_sec": 0, 00:19:28.108 "r_mbytes_per_sec": 0, 00:19:28.108 "w_mbytes_per_sec": 0 00:19:28.108 }, 00:19:28.108 "claimed": true, 00:19:28.108 "claim_type": "exclusive_write", 00:19:28.108 "zoned": false, 00:19:28.108 "supported_io_types": { 00:19:28.108 "read": true, 00:19:28.108 "write": true, 00:19:28.108 "unmap": true, 00:19:28.108 "flush": true, 00:19:28.108 "reset": true, 00:19:28.108 "nvme_admin": false, 00:19:28.108 "nvme_io": false, 00:19:28.108 "nvme_io_md": false, 00:19:28.108 "write_zeroes": true, 00:19:28.108 "zcopy": true, 00:19:28.108 "get_zone_info": false, 00:19:28.108 "zone_management": false, 00:19:28.108 "zone_append": false, 00:19:28.108 "compare": false, 00:19:28.108 "compare_and_write": false, 00:19:28.108 "abort": true, 00:19:28.108 "seek_hole": false, 00:19:28.108 "seek_data": false, 00:19:28.108 "copy": true, 00:19:28.108 "nvme_iov_md": false 00:19:28.108 }, 00:19:28.108 "memory_domains": [ 00:19:28.108 { 00:19:28.108 "dma_device_id": "system", 00:19:28.108 "dma_device_type": 1 00:19:28.108 }, 00:19:28.108 { 00:19:28.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.108 "dma_device_type": 2 00:19:28.108 } 00:19:28.108 ], 00:19:28.108 "driver_specific": {} 00:19:28.108 }' 00:19:28.108 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:28.108 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:28.108 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:28.108 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:28.108 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:28.108 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:28.108 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:28.108 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:28.365 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:28.365 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:28.365 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:28.365 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:28.365 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:28.365 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:28.365 
22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:28.622 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:28.622 "name": "BaseBdev3", 00:19:28.622 "aliases": [ 00:19:28.622 "3fdec900-fd7b-493c-a834-38f7649da025" 00:19:28.622 ], 00:19:28.622 "product_name": "Malloc disk", 00:19:28.622 "block_size": 512, 00:19:28.622 "num_blocks": 65536, 00:19:28.622 "uuid": "3fdec900-fd7b-493c-a834-38f7649da025", 00:19:28.622 "assigned_rate_limits": { 00:19:28.622 "rw_ios_per_sec": 0, 00:19:28.622 "rw_mbytes_per_sec": 0, 00:19:28.622 "r_mbytes_per_sec": 0, 00:19:28.622 "w_mbytes_per_sec": 0 00:19:28.622 }, 00:19:28.622 "claimed": true, 00:19:28.622 "claim_type": "exclusive_write", 00:19:28.622 "zoned": false, 00:19:28.622 "supported_io_types": { 00:19:28.622 "read": true, 00:19:28.622 "write": true, 00:19:28.622 "unmap": true, 00:19:28.622 "flush": true, 00:19:28.622 "reset": true, 00:19:28.622 "nvme_admin": false, 00:19:28.622 "nvme_io": false, 00:19:28.622 "nvme_io_md": false, 00:19:28.622 "write_zeroes": true, 00:19:28.622 "zcopy": true, 00:19:28.622 "get_zone_info": false, 00:19:28.623 "zone_management": false, 00:19:28.623 "zone_append": false, 00:19:28.623 "compare": false, 00:19:28.623 "compare_and_write": false, 00:19:28.623 "abort": true, 00:19:28.623 "seek_hole": false, 00:19:28.623 "seek_data": false, 00:19:28.623 "copy": true, 00:19:28.623 "nvme_iov_md": false 00:19:28.623 }, 00:19:28.623 "memory_domains": [ 00:19:28.623 { 00:19:28.623 "dma_device_id": "system", 00:19:28.623 "dma_device_type": 1 00:19:28.623 }, 00:19:28.623 { 00:19:28.623 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.623 "dma_device_type": 2 00:19:28.623 } 00:19:28.623 ], 00:19:28.623 "driver_specific": {} 00:19:28.623 }' 00:19:28.623 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:28.623 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:28.623 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:28.623 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:28.623 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:28.905 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:28.905 22:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:28.905 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:28.905 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:28.905 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:28.905 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:28.905 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:28.905 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:28.905 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:28.905 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:29.177 22:26:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:29.177 "name": "BaseBdev4", 00:19:29.177 "aliases": [ 00:19:29.177 "a1ae25c0-beed-4103-a5fc-f030ad84bbeb" 00:19:29.177 ], 00:19:29.177 "product_name": "Malloc disk", 00:19:29.178 "block_size": 512, 00:19:29.178 "num_blocks": 65536, 00:19:29.178 "uuid": "a1ae25c0-beed-4103-a5fc-f030ad84bbeb", 00:19:29.178 "assigned_rate_limits": { 00:19:29.178 "rw_ios_per_sec": 0, 00:19:29.178 "rw_mbytes_per_sec": 0, 00:19:29.178 "r_mbytes_per_sec": 0, 00:19:29.178 "w_mbytes_per_sec": 0 00:19:29.178 }, 00:19:29.178 "claimed": true, 00:19:29.178 "claim_type": "exclusive_write", 00:19:29.178 "zoned": false, 00:19:29.178 "supported_io_types": { 00:19:29.178 "read": true, 00:19:29.178 "write": true, 00:19:29.178 "unmap": true, 00:19:29.178 "flush": true, 00:19:29.178 "reset": true, 00:19:29.178 "nvme_admin": false, 00:19:29.178 "nvme_io": false, 00:19:29.178 "nvme_io_md": false, 00:19:29.178 "write_zeroes": true, 00:19:29.178 "zcopy": true, 00:19:29.178 "get_zone_info": false, 00:19:29.178 "zone_management": false, 00:19:29.178 "zone_append": false, 00:19:29.178 "compare": false, 00:19:29.178 "compare_and_write": false, 00:19:29.178 "abort": true, 00:19:29.178 "seek_hole": false, 00:19:29.178 "seek_data": false, 00:19:29.178 "copy": true, 00:19:29.178 "nvme_iov_md": false 00:19:29.178 }, 00:19:29.178 "memory_domains": [ 00:19:29.178 { 00:19:29.178 "dma_device_id": "system", 00:19:29.178 "dma_device_type": 1 00:19:29.178 }, 00:19:29.178 { 00:19:29.178 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.178 "dma_device_type": 2 00:19:29.178 } 00:19:29.178 ], 00:19:29.178 "driver_specific": {} 00:19:29.178 }' 00:19:29.178 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.178 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.178 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:29.178 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.435 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.436 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:29.436 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.436 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.436 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:29.436 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.436 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.436 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:29.436 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:29.694 [2024-07-12 22:26:39.980502] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:29.694 [2024-07-12 22:26:39.980531] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:29.694 [2024-07-12 22:26:39.980596] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
00:19:29.694 [2024-07-12 22:26:39.980663] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:29.694 [2024-07-12 22:26:39.980676] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cd5470 name Existed_Raid, state offline 00:19:29.694 22:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3486555 00:19:29.694 22:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3486555 ']' 00:19:29.694 22:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3486555 00:19:29.694 22:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:19:29.694 22:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:29.694 22:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3486555 00:19:29.951 22:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:29.951 22:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:29.951 22:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3486555' 00:19:29.951 killing process with pid 3486555 00:19:29.951 22:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3486555 00:19:29.951 [2024-07-12 22:26:40.052507] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:29.951 22:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3486555 00:19:29.951 [2024-07-12 22:26:40.091822] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:30.209 22:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:30.209 00:19:30.209 real 0m32.709s 00:19:30.209 user 1m0.054s 00:19:30.209 sys 0m5.835s 00:19:30.209 22:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:30.209 22:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:30.209 ************************************ 00:19:30.209 END TEST raid_state_function_test_sb 00:19:30.209 ************************************ 00:19:30.209 22:26:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:30.209 22:26:40 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:19:30.209 22:26:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:19:30.209 22:26:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:30.209 22:26:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:30.209 ************************************ 00:19:30.209 START TEST raid_superblock_test 00:19:30.209 ************************************ 00:19:30.209 22:26:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:19:30.209 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:19:30.209 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:19:30.209 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:19:30.209 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local 
base_bdevs_malloc 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3491582 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3491582 /var/tmp/spdk-raid.sock 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3491582 ']' 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:30.210 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:30.210 22:26:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:30.210 [2024-07-12 22:26:40.454505] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:19:30.210 [2024-07-12 22:26:40.454566] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3491582 ] 00:19:30.468 [2024-07-12 22:26:40.575497] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:30.468 [2024-07-12 22:26:40.678634] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:30.468 [2024-07-12 22:26:40.736458] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:30.468 [2024-07-12 22:26:40.736484] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:31.402 22:26:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:31.402 22:26:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:19:31.402 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:19:31.402 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:31.402 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:19:31.402 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:19:31.402 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:31.402 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:31.402 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:31.402 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:31.402 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:31.402 malloc1 00:19:31.402 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:31.659 [2024-07-12 22:26:41.876742] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:31.659 [2024-07-12 22:26:41.876791] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:31.659 [2024-07-12 22:26:41.876814] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19bb570 00:19:31.659 [2024-07-12 22:26:41.876827] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:31.659 [2024-07-12 22:26:41.878584] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:31.659 [2024-07-12 22:26:41.878613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:31.659 pt1 00:19:31.659 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:31.659 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:31.659 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:19:31.659 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:19:31.659 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:31.659 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:31.659 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:31.659 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:31.659 22:26:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:31.917 malloc2 00:19:31.917 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:32.175 [2024-07-12 22:26:42.376090] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:32.175 [2024-07-12 22:26:42.376136] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:32.175 [2024-07-12 22:26:42.376156] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19bc970 00:19:32.175 [2024-07-12 22:26:42.376168] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:32.175 [2024-07-12 22:26:42.377768] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:32.175 [2024-07-12 22:26:42.377796] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:32.175 pt2 00:19:32.175 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:32.175 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:32.175 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:19:32.175 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:19:32.175 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:32.175 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:32.175 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:32.175 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:32.175 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:32.433 malloc3 00:19:32.433 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:32.691 [2024-07-12 22:26:42.861961] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:32.691 [2024-07-12 22:26:42.862011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:32.691 [2024-07-12 22:26:42.862031] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b53340 00:19:32.691 [2024-07-12 22:26:42.862044] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:32.691 [2024-07-12 22:26:42.863619] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:32.691 [2024-07-12 22:26:42.863647] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:32.691 pt3 00:19:32.691 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:32.691 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:32.691 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:19:32.691 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:19:32.691 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:19:32.691 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:32.692 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:32.692 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:32.692 22:26:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:19:32.950 malloc4 00:19:32.950 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:33.208 [2024-07-12 22:26:43.356346] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:33.208 [2024-07-12 22:26:43.356394] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:33.208 [2024-07-12 22:26:43.356416] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b55c60 00:19:33.208 [2024-07-12 22:26:43.356429] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:33.208 [2024-07-12 22:26:43.358035] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:33.208 [2024-07-12 22:26:43.358061] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:33.208 pt4 00:19:33.208 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:33.208 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:33.208 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:19:33.466 [2024-07-12 22:26:43.605039] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:33.466 [2024-07-12 22:26:43.606241] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:33.466 [2024-07-12 22:26:43.606296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:33.466 [2024-07-12 22:26:43.606340] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:33.466 [2024-07-12 22:26:43.606509] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19b3530 00:19:33.466 [2024-07-12 22:26:43.606521] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:33.466 [2024-07-12 22:26:43.606708] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19b1770 00:19:33.466 [2024-07-12 22:26:43.606854] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19b3530 00:19:33.466 [2024-07-12 22:26:43.606864] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19b3530 00:19:33.466 [2024-07-12 22:26:43.606971] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:33.466 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:33.466 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:33.466 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:33.466 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:33.466 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:33.466 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:33.466 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:33.466 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:33.466 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:33.466 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:33.467 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.467 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:33.725 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:33.725 "name": "raid_bdev1", 00:19:33.725 "uuid": "74a76d5a-1313-4293-96ed-de26a648df9c", 00:19:33.725 "strip_size_kb": 64, 00:19:33.725 "state": "online", 00:19:33.725 "raid_level": "raid0", 00:19:33.725 "superblock": true, 00:19:33.725 "num_base_bdevs": 4, 00:19:33.725 "num_base_bdevs_discovered": 4, 00:19:33.725 "num_base_bdevs_operational": 4, 00:19:33.725 "base_bdevs_list": [ 00:19:33.725 { 00:19:33.725 "name": "pt1", 00:19:33.725 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:33.725 "is_configured": true, 00:19:33.725 "data_offset": 2048, 00:19:33.725 "data_size": 63488 00:19:33.725 }, 00:19:33.725 { 00:19:33.725 "name": "pt2", 00:19:33.725 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:33.725 "is_configured": true, 00:19:33.725 "data_offset": 2048, 00:19:33.725 "data_size": 63488 00:19:33.725 }, 00:19:33.725 { 00:19:33.725 "name": "pt3", 00:19:33.725 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:33.725 "is_configured": true, 00:19:33.725 "data_offset": 2048, 00:19:33.725 "data_size": 63488 00:19:33.725 }, 00:19:33.725 { 00:19:33.725 "name": "pt4", 00:19:33.725 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:33.725 "is_configured": true, 00:19:33.725 "data_offset": 2048, 00:19:33.725 "data_size": 63488 00:19:33.725 } 00:19:33.725 ] 00:19:33.725 }' 00:19:33.725 22:26:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:33.725 22:26:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:34.292 22:26:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:19:34.292 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:34.292 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:34.292 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:34.292 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:34.292 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:34.292 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:34.292 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:34.551 [2024-07-12 22:26:44.684159] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:34.551 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:34.551 "name": "raid_bdev1", 00:19:34.551 "aliases": [ 00:19:34.551 "74a76d5a-1313-4293-96ed-de26a648df9c" 00:19:34.551 ], 00:19:34.551 "product_name": "Raid Volume", 00:19:34.551 "block_size": 512, 00:19:34.551 "num_blocks": 253952, 00:19:34.551 "uuid": "74a76d5a-1313-4293-96ed-de26a648df9c", 00:19:34.551 "assigned_rate_limits": { 00:19:34.551 "rw_ios_per_sec": 0, 00:19:34.551 "rw_mbytes_per_sec": 0, 00:19:34.551 "r_mbytes_per_sec": 0, 00:19:34.551 "w_mbytes_per_sec": 0 00:19:34.551 }, 00:19:34.551 "claimed": false, 00:19:34.551 "zoned": false, 00:19:34.551 "supported_io_types": { 00:19:34.551 "read": true, 00:19:34.551 "write": true, 00:19:34.551 "unmap": true, 00:19:34.551 "flush": true, 00:19:34.551 "reset": true, 00:19:34.551 "nvme_admin": false, 00:19:34.551 "nvme_io": false, 00:19:34.551 "nvme_io_md": false, 00:19:34.551 "write_zeroes": true, 00:19:34.551 "zcopy": false, 00:19:34.551 "get_zone_info": false, 00:19:34.551 "zone_management": false, 00:19:34.551 "zone_append": false, 00:19:34.551 "compare": false, 00:19:34.551 "compare_and_write": false, 00:19:34.551 "abort": false, 00:19:34.551 "seek_hole": false, 00:19:34.551 "seek_data": false, 00:19:34.551 "copy": false, 00:19:34.551 "nvme_iov_md": false 00:19:34.551 }, 00:19:34.551 "memory_domains": [ 00:19:34.551 { 00:19:34.551 "dma_device_id": "system", 00:19:34.551 "dma_device_type": 1 00:19:34.551 }, 00:19:34.551 { 00:19:34.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.551 "dma_device_type": 2 00:19:34.551 }, 00:19:34.551 { 00:19:34.551 "dma_device_id": "system", 00:19:34.551 "dma_device_type": 1 00:19:34.551 }, 00:19:34.551 { 00:19:34.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.551 "dma_device_type": 2 00:19:34.551 }, 00:19:34.551 { 00:19:34.551 "dma_device_id": "system", 00:19:34.551 "dma_device_type": 1 00:19:34.551 }, 00:19:34.551 { 00:19:34.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.551 "dma_device_type": 2 00:19:34.551 }, 00:19:34.551 { 00:19:34.551 "dma_device_id": "system", 00:19:34.551 "dma_device_type": 1 00:19:34.551 }, 00:19:34.551 { 00:19:34.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.551 "dma_device_type": 2 00:19:34.551 } 00:19:34.551 ], 00:19:34.551 "driver_specific": { 00:19:34.551 "raid": { 00:19:34.551 "uuid": "74a76d5a-1313-4293-96ed-de26a648df9c", 00:19:34.551 "strip_size_kb": 64, 00:19:34.551 "state": "online", 00:19:34.551 "raid_level": "raid0", 00:19:34.551 "superblock": 
true, 00:19:34.551 "num_base_bdevs": 4, 00:19:34.551 "num_base_bdevs_discovered": 4, 00:19:34.551 "num_base_bdevs_operational": 4, 00:19:34.551 "base_bdevs_list": [ 00:19:34.551 { 00:19:34.551 "name": "pt1", 00:19:34.551 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:34.551 "is_configured": true, 00:19:34.551 "data_offset": 2048, 00:19:34.551 "data_size": 63488 00:19:34.551 }, 00:19:34.551 { 00:19:34.551 "name": "pt2", 00:19:34.551 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:34.551 "is_configured": true, 00:19:34.551 "data_offset": 2048, 00:19:34.551 "data_size": 63488 00:19:34.551 }, 00:19:34.551 { 00:19:34.551 "name": "pt3", 00:19:34.551 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:34.551 "is_configured": true, 00:19:34.551 "data_offset": 2048, 00:19:34.551 "data_size": 63488 00:19:34.551 }, 00:19:34.551 { 00:19:34.551 "name": "pt4", 00:19:34.551 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:34.551 "is_configured": true, 00:19:34.551 "data_offset": 2048, 00:19:34.551 "data_size": 63488 00:19:34.551 } 00:19:34.551 ] 00:19:34.551 } 00:19:34.551 } 00:19:34.551 }' 00:19:34.551 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:34.551 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:34.551 pt2 00:19:34.551 pt3 00:19:34.551 pt4' 00:19:34.551 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:34.551 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:34.551 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:34.810 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:34.810 "name": "pt1", 00:19:34.810 "aliases": [ 00:19:34.810 "00000000-0000-0000-0000-000000000001" 00:19:34.810 ], 00:19:34.810 "product_name": "passthru", 00:19:34.810 "block_size": 512, 00:19:34.810 "num_blocks": 65536, 00:19:34.810 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:34.810 "assigned_rate_limits": { 00:19:34.810 "rw_ios_per_sec": 0, 00:19:34.810 "rw_mbytes_per_sec": 0, 00:19:34.810 "r_mbytes_per_sec": 0, 00:19:34.810 "w_mbytes_per_sec": 0 00:19:34.810 }, 00:19:34.810 "claimed": true, 00:19:34.810 "claim_type": "exclusive_write", 00:19:34.810 "zoned": false, 00:19:34.810 "supported_io_types": { 00:19:34.810 "read": true, 00:19:34.810 "write": true, 00:19:34.810 "unmap": true, 00:19:34.810 "flush": true, 00:19:34.810 "reset": true, 00:19:34.810 "nvme_admin": false, 00:19:34.810 "nvme_io": false, 00:19:34.810 "nvme_io_md": false, 00:19:34.810 "write_zeroes": true, 00:19:34.810 "zcopy": true, 00:19:34.810 "get_zone_info": false, 00:19:34.810 "zone_management": false, 00:19:34.810 "zone_append": false, 00:19:34.810 "compare": false, 00:19:34.810 "compare_and_write": false, 00:19:34.810 "abort": true, 00:19:34.810 "seek_hole": false, 00:19:34.810 "seek_data": false, 00:19:34.810 "copy": true, 00:19:34.810 "nvme_iov_md": false 00:19:34.810 }, 00:19:34.810 "memory_domains": [ 00:19:34.810 { 00:19:34.810 "dma_device_id": "system", 00:19:34.810 "dma_device_type": 1 00:19:34.810 }, 00:19:34.810 { 00:19:34.810 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.810 "dma_device_type": 2 00:19:34.810 } 00:19:34.810 ], 00:19:34.810 "driver_specific": { 00:19:34.810 "passthru": 
{ 00:19:34.810 "name": "pt1", 00:19:34.810 "base_bdev_name": "malloc1" 00:19:34.810 } 00:19:34.810 } 00:19:34.810 }' 00:19:34.810 22:26:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:34.810 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:34.810 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:34.810 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:35.068 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:35.069 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:35.069 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.069 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.069 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:35.069 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:35.069 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:35.069 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:35.069 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:35.069 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:35.069 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:35.327 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:35.327 "name": "pt2", 00:19:35.327 "aliases": [ 00:19:35.327 "00000000-0000-0000-0000-000000000002" 00:19:35.327 ], 00:19:35.327 "product_name": "passthru", 00:19:35.327 "block_size": 512, 00:19:35.327 "num_blocks": 65536, 00:19:35.327 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:35.327 "assigned_rate_limits": { 00:19:35.327 "rw_ios_per_sec": 0, 00:19:35.327 "rw_mbytes_per_sec": 0, 00:19:35.327 "r_mbytes_per_sec": 0, 00:19:35.327 "w_mbytes_per_sec": 0 00:19:35.327 }, 00:19:35.327 "claimed": true, 00:19:35.327 "claim_type": "exclusive_write", 00:19:35.327 "zoned": false, 00:19:35.327 "supported_io_types": { 00:19:35.327 "read": true, 00:19:35.327 "write": true, 00:19:35.327 "unmap": true, 00:19:35.327 "flush": true, 00:19:35.327 "reset": true, 00:19:35.327 "nvme_admin": false, 00:19:35.327 "nvme_io": false, 00:19:35.327 "nvme_io_md": false, 00:19:35.327 "write_zeroes": true, 00:19:35.327 "zcopy": true, 00:19:35.327 "get_zone_info": false, 00:19:35.327 "zone_management": false, 00:19:35.327 "zone_append": false, 00:19:35.327 "compare": false, 00:19:35.327 "compare_and_write": false, 00:19:35.327 "abort": true, 00:19:35.327 "seek_hole": false, 00:19:35.327 "seek_data": false, 00:19:35.327 "copy": true, 00:19:35.327 "nvme_iov_md": false 00:19:35.327 }, 00:19:35.327 "memory_domains": [ 00:19:35.327 { 00:19:35.327 "dma_device_id": "system", 00:19:35.327 "dma_device_type": 1 00:19:35.327 }, 00:19:35.327 { 00:19:35.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.327 "dma_device_type": 2 00:19:35.327 } 00:19:35.327 ], 00:19:35.327 "driver_specific": { 00:19:35.327 "passthru": { 00:19:35.327 "name": "pt2", 00:19:35.327 "base_bdev_name": "malloc2" 00:19:35.327 } 00:19:35.327 } 00:19:35.327 }' 00:19:35.327 
22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:35.327 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:35.584 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:35.584 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:35.584 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:35.584 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:35.584 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.584 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.584 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:35.584 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:35.842 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:35.842 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:35.842 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:35.842 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:35.842 22:26:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:36.100 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:36.100 "name": "pt3", 00:19:36.100 "aliases": [ 00:19:36.100 "00000000-0000-0000-0000-000000000003" 00:19:36.100 ], 00:19:36.100 "product_name": "passthru", 00:19:36.100 "block_size": 512, 00:19:36.100 "num_blocks": 65536, 00:19:36.100 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:36.100 "assigned_rate_limits": { 00:19:36.100 "rw_ios_per_sec": 0, 00:19:36.100 "rw_mbytes_per_sec": 0, 00:19:36.100 "r_mbytes_per_sec": 0, 00:19:36.100 "w_mbytes_per_sec": 0 00:19:36.100 }, 00:19:36.100 "claimed": true, 00:19:36.100 "claim_type": "exclusive_write", 00:19:36.100 "zoned": false, 00:19:36.100 "supported_io_types": { 00:19:36.100 "read": true, 00:19:36.100 "write": true, 00:19:36.100 "unmap": true, 00:19:36.100 "flush": true, 00:19:36.100 "reset": true, 00:19:36.100 "nvme_admin": false, 00:19:36.100 "nvme_io": false, 00:19:36.100 "nvme_io_md": false, 00:19:36.100 "write_zeroes": true, 00:19:36.100 "zcopy": true, 00:19:36.100 "get_zone_info": false, 00:19:36.100 "zone_management": false, 00:19:36.100 "zone_append": false, 00:19:36.100 "compare": false, 00:19:36.100 "compare_and_write": false, 00:19:36.100 "abort": true, 00:19:36.100 "seek_hole": false, 00:19:36.100 "seek_data": false, 00:19:36.100 "copy": true, 00:19:36.100 "nvme_iov_md": false 00:19:36.100 }, 00:19:36.100 "memory_domains": [ 00:19:36.100 { 00:19:36.100 "dma_device_id": "system", 00:19:36.100 "dma_device_type": 1 00:19:36.100 }, 00:19:36.100 { 00:19:36.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.100 "dma_device_type": 2 00:19:36.100 } 00:19:36.100 ], 00:19:36.100 "driver_specific": { 00:19:36.100 "passthru": { 00:19:36.100 "name": "pt3", 00:19:36.100 "base_bdev_name": "malloc3" 00:19:36.100 } 00:19:36.100 } 00:19:36.100 }' 00:19:36.100 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.100 22:26:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.100 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:36.100 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.100 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.100 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:36.100 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.359 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.359 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:36.359 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.359 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.359 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:36.359 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:36.359 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:36.359 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:36.617 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:36.617 "name": "pt4", 00:19:36.617 "aliases": [ 00:19:36.617 "00000000-0000-0000-0000-000000000004" 00:19:36.617 ], 00:19:36.617 "product_name": "passthru", 00:19:36.617 "block_size": 512, 00:19:36.617 "num_blocks": 65536, 00:19:36.617 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:36.617 "assigned_rate_limits": { 00:19:36.617 "rw_ios_per_sec": 0, 00:19:36.617 "rw_mbytes_per_sec": 0, 00:19:36.617 "r_mbytes_per_sec": 0, 00:19:36.617 "w_mbytes_per_sec": 0 00:19:36.617 }, 00:19:36.617 "claimed": true, 00:19:36.617 "claim_type": "exclusive_write", 00:19:36.617 "zoned": false, 00:19:36.617 "supported_io_types": { 00:19:36.617 "read": true, 00:19:36.617 "write": true, 00:19:36.617 "unmap": true, 00:19:36.617 "flush": true, 00:19:36.617 "reset": true, 00:19:36.617 "nvme_admin": false, 00:19:36.617 "nvme_io": false, 00:19:36.617 "nvme_io_md": false, 00:19:36.617 "write_zeroes": true, 00:19:36.617 "zcopy": true, 00:19:36.617 "get_zone_info": false, 00:19:36.617 "zone_management": false, 00:19:36.617 "zone_append": false, 00:19:36.617 "compare": false, 00:19:36.617 "compare_and_write": false, 00:19:36.617 "abort": true, 00:19:36.617 "seek_hole": false, 00:19:36.617 "seek_data": false, 00:19:36.617 "copy": true, 00:19:36.617 "nvme_iov_md": false 00:19:36.617 }, 00:19:36.617 "memory_domains": [ 00:19:36.617 { 00:19:36.617 "dma_device_id": "system", 00:19:36.617 "dma_device_type": 1 00:19:36.617 }, 00:19:36.617 { 00:19:36.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.617 "dma_device_type": 2 00:19:36.617 } 00:19:36.617 ], 00:19:36.617 "driver_specific": { 00:19:36.617 "passthru": { 00:19:36.617 "name": "pt4", 00:19:36.617 "base_bdev_name": "malloc4" 00:19:36.617 } 00:19:36.617 } 00:19:36.617 }' 00:19:36.617 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.617 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.617 22:26:46 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:36.617 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.617 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.875 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:36.875 22:26:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.875 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.875 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:36.875 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.875 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.875 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:36.875 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:36.875 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:19:37.134 [2024-07-12 22:26:47.379292] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:37.134 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=74a76d5a-1313-4293-96ed-de26a648df9c 00:19:37.134 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 74a76d5a-1313-4293-96ed-de26a648df9c ']' 00:19:37.134 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:37.392 [2024-07-12 22:26:47.623637] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:37.392 [2024-07-12 22:26:47.623663] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:37.392 [2024-07-12 22:26:47.623718] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:37.392 [2024-07-12 22:26:47.623781] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:37.392 [2024-07-12 22:26:47.623793] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19b3530 name raid_bdev1, state offline 00:19:37.392 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.392 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:19:37.651 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:19:37.651 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:19:37.651 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:37.651 22:26:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:37.910 22:26:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:37.910 22:26:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:38.168 22:26:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:38.168 22:26:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:38.427 22:26:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:38.427 22:26:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:38.685 22:26:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:38.685 22:26:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:38.943 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:19:38.943 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:38.943 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:19:38.943 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:38.944 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:38.944 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:38.944 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:38.944 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:38.944 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:38.944 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:38.944 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:38.944 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:38.944 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:39.203 [2024-07-12 22:26:49.328064] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:39.203 [2024-07-12 22:26:49.329448] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:39.203 [2024-07-12 22:26:49.329491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
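The trace around this point drives the raid_superblock_test flow entirely over JSON-RPC: each base device is a malloc bdev wrapped in a passthru bdev, the raid0 volume is assembled from the passthru bdevs, and the create call is then repeated against the already-claimed malloc bdevs to confirm it is rejected. Below is a minimal sketch of that RPC sequence, assuming an SPDK target is already listening on /var/tmp/spdk-raid.sock; only commands that appear in the trace are used, and the loop form and the .state jq access are illustrative rather than the test's exact wording.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    for i in 1 2 3 4; do
        # 32 MiB malloc bdev with 512-byte blocks, then a passthru bdev that claims it
        $rpc -s $sock bdev_malloc_create 32 512 -b malloc$i
        $rpc -s $sock bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
    done
    # assemble a raid0 volume (64 KiB strip, superblock enabled) from the passthru bdevs
    $rpc -s $sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s
    # confirm the volume came online
    $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'
    # creating the same raid directly on the already-claimed malloc bdevs must fail
    # (the trace shows JSON-RPC error -17, "File exists")
    $rpc -s $sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 || true
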
00:19:39.203 [2024-07-12 22:26:49.329526] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:39.203 [2024-07-12 22:26:49.329573] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:39.203 [2024-07-12 22:26:49.329611] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:39.203 [2024-07-12 22:26:49.329634] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:39.203 [2024-07-12 22:26:49.329656] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:39.203 [2024-07-12 22:26:49.329675] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:39.203 [2024-07-12 22:26:49.329686] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b5eff0 name raid_bdev1, state configuring 00:19:39.203 request: 00:19:39.203 { 00:19:39.203 "name": "raid_bdev1", 00:19:39.203 "raid_level": "raid0", 00:19:39.203 "base_bdevs": [ 00:19:39.203 "malloc1", 00:19:39.203 "malloc2", 00:19:39.203 "malloc3", 00:19:39.203 "malloc4" 00:19:39.203 ], 00:19:39.203 "strip_size_kb": 64, 00:19:39.203 "superblock": false, 00:19:39.203 "method": "bdev_raid_create", 00:19:39.203 "req_id": 1 00:19:39.203 } 00:19:39.203 Got JSON-RPC error response 00:19:39.203 response: 00:19:39.203 { 00:19:39.203 "code": -17, 00:19:39.203 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:39.203 } 00:19:39.203 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:19:39.203 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:39.203 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:39.203 22:26:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:39.203 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.203 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:19:39.462 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:19:39.462 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:19:39.462 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:39.719 [2024-07-12 22:26:49.801248] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:39.719 [2024-07-12 22:26:49.801295] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:39.719 [2024-07-12 22:26:49.801317] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19bb7a0 00:19:39.719 [2024-07-12 22:26:49.801329] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:39.719 [2024-07-12 22:26:49.802938] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:39.719 [2024-07-12 22:26:49.802966] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:39.719 [2024-07-12 
22:26:49.803035] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:39.719 [2024-07-12 22:26:49.803063] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:39.719 pt1 00:19:39.719 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:39.719 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:39.719 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:39.719 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:39.719 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:39.719 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:39.719 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.719 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.719 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.719 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.719 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.719 22:26:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:39.978 22:26:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.978 "name": "raid_bdev1", 00:19:39.978 "uuid": "74a76d5a-1313-4293-96ed-de26a648df9c", 00:19:39.978 "strip_size_kb": 64, 00:19:39.978 "state": "configuring", 00:19:39.978 "raid_level": "raid0", 00:19:39.978 "superblock": true, 00:19:39.978 "num_base_bdevs": 4, 00:19:39.978 "num_base_bdevs_discovered": 1, 00:19:39.978 "num_base_bdevs_operational": 4, 00:19:39.978 "base_bdevs_list": [ 00:19:39.978 { 00:19:39.978 "name": "pt1", 00:19:39.978 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:39.978 "is_configured": true, 00:19:39.978 "data_offset": 2048, 00:19:39.978 "data_size": 63488 00:19:39.978 }, 00:19:39.978 { 00:19:39.978 "name": null, 00:19:39.978 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:39.978 "is_configured": false, 00:19:39.978 "data_offset": 2048, 00:19:39.978 "data_size": 63488 00:19:39.978 }, 00:19:39.978 { 00:19:39.978 "name": null, 00:19:39.978 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:39.978 "is_configured": false, 00:19:39.978 "data_offset": 2048, 00:19:39.978 "data_size": 63488 00:19:39.978 }, 00:19:39.978 { 00:19:39.978 "name": null, 00:19:39.978 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:39.978 "is_configured": false, 00:19:39.978 "data_offset": 2048, 00:19:39.978 "data_size": 63488 00:19:39.978 } 00:19:39.978 ] 00:19:39.978 }' 00:19:39.978 22:26:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.978 22:26:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:40.912 22:26:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:19:40.912 22:26:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:40.912 [2024-07-12 22:26:51.088677] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:40.912 [2024-07-12 22:26:51.088724] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:40.912 [2024-07-12 22:26:51.088746] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b54940 00:19:40.912 [2024-07-12 22:26:51.088758] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:40.912 [2024-07-12 22:26:51.089122] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:40.913 [2024-07-12 22:26:51.089141] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:40.913 [2024-07-12 22:26:51.089205] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:40.913 [2024-07-12 22:26:51.089225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:40.913 pt2 00:19:40.913 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:41.171 [2024-07-12 22:26:51.273173] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:41.171 "name": "raid_bdev1", 00:19:41.171 "uuid": "74a76d5a-1313-4293-96ed-de26a648df9c", 00:19:41.171 "strip_size_kb": 64, 00:19:41.171 "state": "configuring", 00:19:41.171 "raid_level": "raid0", 00:19:41.171 "superblock": true, 00:19:41.171 "num_base_bdevs": 4, 00:19:41.171 "num_base_bdevs_discovered": 1, 00:19:41.171 "num_base_bdevs_operational": 4, 00:19:41.171 "base_bdevs_list": [ 00:19:41.171 { 00:19:41.171 "name": "pt1", 00:19:41.171 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:41.171 "is_configured": true, 00:19:41.171 "data_offset": 2048, 00:19:41.171 "data_size": 63488 00:19:41.171 }, 00:19:41.171 { 
00:19:41.171 "name": null, 00:19:41.171 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:41.171 "is_configured": false, 00:19:41.171 "data_offset": 2048, 00:19:41.171 "data_size": 63488 00:19:41.171 }, 00:19:41.171 { 00:19:41.171 "name": null, 00:19:41.171 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:41.171 "is_configured": false, 00:19:41.171 "data_offset": 2048, 00:19:41.171 "data_size": 63488 00:19:41.171 }, 00:19:41.171 { 00:19:41.171 "name": null, 00:19:41.171 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:41.171 "is_configured": false, 00:19:41.171 "data_offset": 2048, 00:19:41.171 "data_size": 63488 00:19:41.171 } 00:19:41.171 ] 00:19:41.171 }' 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:41.171 22:26:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:42.104 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:19:42.104 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:42.104 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:42.104 [2024-07-12 22:26:52.291874] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:42.104 [2024-07-12 22:26:52.291922] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:42.104 [2024-07-12 22:26:52.291950] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19b2060 00:19:42.104 [2024-07-12 22:26:52.291970] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:42.104 [2024-07-12 22:26:52.292315] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:42.104 [2024-07-12 22:26:52.292334] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:42.104 [2024-07-12 22:26:52.292401] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:42.104 [2024-07-12 22:26:52.292421] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:42.104 pt2 00:19:42.104 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:42.104 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:42.104 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:42.363 [2024-07-12 22:26:52.536536] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:42.363 [2024-07-12 22:26:52.536578] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:42.363 [2024-07-12 22:26:52.536599] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19b48d0 00:19:42.363 [2024-07-12 22:26:52.536611] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:42.363 [2024-07-12 22:26:52.536938] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:42.363 [2024-07-12 22:26:52.536956] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:42.363 [2024-07-12 22:26:52.537014] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:42.363 [2024-07-12 22:26:52.537033] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:42.363 pt3 00:19:42.363 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:42.363 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:42.363 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:42.621 [2024-07-12 22:26:52.769155] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:42.621 [2024-07-12 22:26:52.769190] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:42.621 [2024-07-12 22:26:52.769207] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19b5b80 00:19:42.622 [2024-07-12 22:26:52.769219] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:42.622 [2024-07-12 22:26:52.769521] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:42.622 [2024-07-12 22:26:52.769539] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:42.622 [2024-07-12 22:26:52.769593] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:42.622 [2024-07-12 22:26:52.769612] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:42.622 [2024-07-12 22:26:52.769734] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19b2780 00:19:42.622 [2024-07-12 22:26:52.769744] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:42.622 [2024-07-12 22:26:52.769913] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19b7d70 00:19:42.622 [2024-07-12 22:26:52.770055] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19b2780 00:19:42.622 [2024-07-12 22:26:52.770065] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19b2780 00:19:42.622 [2024-07-12 22:26:52.770164] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:42.622 pt4 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:42.622 22:26:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.622 22:26:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:42.879 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:42.879 "name": "raid_bdev1", 00:19:42.879 "uuid": "74a76d5a-1313-4293-96ed-de26a648df9c", 00:19:42.879 "strip_size_kb": 64, 00:19:42.879 "state": "online", 00:19:42.879 "raid_level": "raid0", 00:19:42.879 "superblock": true, 00:19:42.879 "num_base_bdevs": 4, 00:19:42.879 "num_base_bdevs_discovered": 4, 00:19:42.879 "num_base_bdevs_operational": 4, 00:19:42.879 "base_bdevs_list": [ 00:19:42.879 { 00:19:42.879 "name": "pt1", 00:19:42.879 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:42.879 "is_configured": true, 00:19:42.879 "data_offset": 2048, 00:19:42.880 "data_size": 63488 00:19:42.880 }, 00:19:42.880 { 00:19:42.880 "name": "pt2", 00:19:42.880 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:42.880 "is_configured": true, 00:19:42.880 "data_offset": 2048, 00:19:42.880 "data_size": 63488 00:19:42.880 }, 00:19:42.880 { 00:19:42.880 "name": "pt3", 00:19:42.880 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:42.880 "is_configured": true, 00:19:42.880 "data_offset": 2048, 00:19:42.880 "data_size": 63488 00:19:42.880 }, 00:19:42.880 { 00:19:42.880 "name": "pt4", 00:19:42.880 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:42.880 "is_configured": true, 00:19:42.880 "data_offset": 2048, 00:19:42.880 "data_size": 63488 00:19:42.880 } 00:19:42.880 ] 00:19:42.880 }' 00:19:42.880 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:42.880 22:26:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:43.445 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:19:43.445 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:43.445 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:43.445 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:43.445 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:43.445 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:43.445 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:43.445 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:43.704 [2024-07-12 22:26:53.840427] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:43.704 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:43.704 "name": "raid_bdev1", 00:19:43.704 "aliases": [ 00:19:43.704 "74a76d5a-1313-4293-96ed-de26a648df9c" 00:19:43.704 ], 00:19:43.704 "product_name": "Raid Volume", 00:19:43.704 "block_size": 512, 00:19:43.704 "num_blocks": 253952, 00:19:43.704 "uuid": 
"74a76d5a-1313-4293-96ed-de26a648df9c", 00:19:43.704 "assigned_rate_limits": { 00:19:43.704 "rw_ios_per_sec": 0, 00:19:43.704 "rw_mbytes_per_sec": 0, 00:19:43.704 "r_mbytes_per_sec": 0, 00:19:43.704 "w_mbytes_per_sec": 0 00:19:43.704 }, 00:19:43.704 "claimed": false, 00:19:43.704 "zoned": false, 00:19:43.704 "supported_io_types": { 00:19:43.704 "read": true, 00:19:43.704 "write": true, 00:19:43.704 "unmap": true, 00:19:43.704 "flush": true, 00:19:43.704 "reset": true, 00:19:43.704 "nvme_admin": false, 00:19:43.704 "nvme_io": false, 00:19:43.704 "nvme_io_md": false, 00:19:43.704 "write_zeroes": true, 00:19:43.704 "zcopy": false, 00:19:43.704 "get_zone_info": false, 00:19:43.704 "zone_management": false, 00:19:43.704 "zone_append": false, 00:19:43.704 "compare": false, 00:19:43.704 "compare_and_write": false, 00:19:43.704 "abort": false, 00:19:43.704 "seek_hole": false, 00:19:43.704 "seek_data": false, 00:19:43.704 "copy": false, 00:19:43.704 "nvme_iov_md": false 00:19:43.704 }, 00:19:43.704 "memory_domains": [ 00:19:43.704 { 00:19:43.704 "dma_device_id": "system", 00:19:43.704 "dma_device_type": 1 00:19:43.704 }, 00:19:43.704 { 00:19:43.704 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.704 "dma_device_type": 2 00:19:43.704 }, 00:19:43.704 { 00:19:43.704 "dma_device_id": "system", 00:19:43.704 "dma_device_type": 1 00:19:43.704 }, 00:19:43.704 { 00:19:43.704 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.704 "dma_device_type": 2 00:19:43.704 }, 00:19:43.704 { 00:19:43.704 "dma_device_id": "system", 00:19:43.704 "dma_device_type": 1 00:19:43.704 }, 00:19:43.704 { 00:19:43.704 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.704 "dma_device_type": 2 00:19:43.704 }, 00:19:43.704 { 00:19:43.704 "dma_device_id": "system", 00:19:43.704 "dma_device_type": 1 00:19:43.704 }, 00:19:43.704 { 00:19:43.704 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.704 "dma_device_type": 2 00:19:43.704 } 00:19:43.704 ], 00:19:43.704 "driver_specific": { 00:19:43.704 "raid": { 00:19:43.704 "uuid": "74a76d5a-1313-4293-96ed-de26a648df9c", 00:19:43.704 "strip_size_kb": 64, 00:19:43.704 "state": "online", 00:19:43.704 "raid_level": "raid0", 00:19:43.704 "superblock": true, 00:19:43.704 "num_base_bdevs": 4, 00:19:43.704 "num_base_bdevs_discovered": 4, 00:19:43.704 "num_base_bdevs_operational": 4, 00:19:43.704 "base_bdevs_list": [ 00:19:43.704 { 00:19:43.704 "name": "pt1", 00:19:43.704 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:43.704 "is_configured": true, 00:19:43.704 "data_offset": 2048, 00:19:43.704 "data_size": 63488 00:19:43.704 }, 00:19:43.704 { 00:19:43.704 "name": "pt2", 00:19:43.704 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:43.704 "is_configured": true, 00:19:43.704 "data_offset": 2048, 00:19:43.704 "data_size": 63488 00:19:43.704 }, 00:19:43.704 { 00:19:43.704 "name": "pt3", 00:19:43.704 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:43.704 "is_configured": true, 00:19:43.704 "data_offset": 2048, 00:19:43.704 "data_size": 63488 00:19:43.704 }, 00:19:43.704 { 00:19:43.704 "name": "pt4", 00:19:43.704 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:43.704 "is_configured": true, 00:19:43.704 "data_offset": 2048, 00:19:43.704 "data_size": 63488 00:19:43.704 } 00:19:43.704 ] 00:19:43.704 } 00:19:43.704 } 00:19:43.705 }' 00:19:43.705 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:43.705 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:19:43.705 pt2 00:19:43.705 pt3 00:19:43.705 pt4' 00:19:43.705 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.705 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:43.705 22:26:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.963 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.963 "name": "pt1", 00:19:43.963 "aliases": [ 00:19:43.963 "00000000-0000-0000-0000-000000000001" 00:19:43.963 ], 00:19:43.963 "product_name": "passthru", 00:19:43.963 "block_size": 512, 00:19:43.963 "num_blocks": 65536, 00:19:43.963 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:43.963 "assigned_rate_limits": { 00:19:43.963 "rw_ios_per_sec": 0, 00:19:43.963 "rw_mbytes_per_sec": 0, 00:19:43.963 "r_mbytes_per_sec": 0, 00:19:43.963 "w_mbytes_per_sec": 0 00:19:43.963 }, 00:19:43.963 "claimed": true, 00:19:43.963 "claim_type": "exclusive_write", 00:19:43.963 "zoned": false, 00:19:43.963 "supported_io_types": { 00:19:43.963 "read": true, 00:19:43.963 "write": true, 00:19:43.963 "unmap": true, 00:19:43.963 "flush": true, 00:19:43.963 "reset": true, 00:19:43.963 "nvme_admin": false, 00:19:43.963 "nvme_io": false, 00:19:43.963 "nvme_io_md": false, 00:19:43.963 "write_zeroes": true, 00:19:43.963 "zcopy": true, 00:19:43.963 "get_zone_info": false, 00:19:43.963 "zone_management": false, 00:19:43.963 "zone_append": false, 00:19:43.963 "compare": false, 00:19:43.963 "compare_and_write": false, 00:19:43.963 "abort": true, 00:19:43.963 "seek_hole": false, 00:19:43.963 "seek_data": false, 00:19:43.963 "copy": true, 00:19:43.963 "nvme_iov_md": false 00:19:43.963 }, 00:19:43.963 "memory_domains": [ 00:19:43.963 { 00:19:43.963 "dma_device_id": "system", 00:19:43.963 "dma_device_type": 1 00:19:43.963 }, 00:19:43.963 { 00:19:43.963 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.963 "dma_device_type": 2 00:19:43.963 } 00:19:43.963 ], 00:19:43.963 "driver_specific": { 00:19:43.963 "passthru": { 00:19:43.963 "name": "pt1", 00:19:43.963 "base_bdev_name": "malloc1" 00:19:43.963 } 00:19:43.963 } 00:19:43.964 }' 00:19:43.964 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.964 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.964 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.964 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.964 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.221 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:44.221 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.222 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.222 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:44.222 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.222 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.222 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:44.222 22:26:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:44.222 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:44.222 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:44.485 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:44.485 "name": "pt2", 00:19:44.485 "aliases": [ 00:19:44.485 "00000000-0000-0000-0000-000000000002" 00:19:44.485 ], 00:19:44.485 "product_name": "passthru", 00:19:44.485 "block_size": 512, 00:19:44.485 "num_blocks": 65536, 00:19:44.485 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:44.485 "assigned_rate_limits": { 00:19:44.485 "rw_ios_per_sec": 0, 00:19:44.485 "rw_mbytes_per_sec": 0, 00:19:44.485 "r_mbytes_per_sec": 0, 00:19:44.485 "w_mbytes_per_sec": 0 00:19:44.485 }, 00:19:44.485 "claimed": true, 00:19:44.485 "claim_type": "exclusive_write", 00:19:44.485 "zoned": false, 00:19:44.485 "supported_io_types": { 00:19:44.485 "read": true, 00:19:44.485 "write": true, 00:19:44.485 "unmap": true, 00:19:44.485 "flush": true, 00:19:44.485 "reset": true, 00:19:44.485 "nvme_admin": false, 00:19:44.485 "nvme_io": false, 00:19:44.485 "nvme_io_md": false, 00:19:44.485 "write_zeroes": true, 00:19:44.485 "zcopy": true, 00:19:44.485 "get_zone_info": false, 00:19:44.485 "zone_management": false, 00:19:44.485 "zone_append": false, 00:19:44.485 "compare": false, 00:19:44.485 "compare_and_write": false, 00:19:44.485 "abort": true, 00:19:44.485 "seek_hole": false, 00:19:44.485 "seek_data": false, 00:19:44.485 "copy": true, 00:19:44.485 "nvme_iov_md": false 00:19:44.485 }, 00:19:44.485 "memory_domains": [ 00:19:44.485 { 00:19:44.485 "dma_device_id": "system", 00:19:44.485 "dma_device_type": 1 00:19:44.485 }, 00:19:44.485 { 00:19:44.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:44.485 "dma_device_type": 2 00:19:44.485 } 00:19:44.485 ], 00:19:44.485 "driver_specific": { 00:19:44.485 "passthru": { 00:19:44.485 "name": "pt2", 00:19:44.485 "base_bdev_name": "malloc2" 00:19:44.485 } 00:19:44.485 } 00:19:44.485 }' 00:19:44.485 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.485 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.743 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:44.743 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.743 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.743 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:44.743 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.743 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.743 22:26:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:44.743 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.743 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:45.002 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:45.002 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:45.002 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:45.002 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:45.260 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:45.260 "name": "pt3", 00:19:45.260 "aliases": [ 00:19:45.260 "00000000-0000-0000-0000-000000000003" 00:19:45.260 ], 00:19:45.260 "product_name": "passthru", 00:19:45.260 "block_size": 512, 00:19:45.260 "num_blocks": 65536, 00:19:45.260 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:45.260 "assigned_rate_limits": { 00:19:45.260 "rw_ios_per_sec": 0, 00:19:45.260 "rw_mbytes_per_sec": 0, 00:19:45.260 "r_mbytes_per_sec": 0, 00:19:45.260 "w_mbytes_per_sec": 0 00:19:45.260 }, 00:19:45.260 "claimed": true, 00:19:45.260 "claim_type": "exclusive_write", 00:19:45.260 "zoned": false, 00:19:45.260 "supported_io_types": { 00:19:45.260 "read": true, 00:19:45.260 "write": true, 00:19:45.260 "unmap": true, 00:19:45.260 "flush": true, 00:19:45.260 "reset": true, 00:19:45.260 "nvme_admin": false, 00:19:45.260 "nvme_io": false, 00:19:45.260 "nvme_io_md": false, 00:19:45.260 "write_zeroes": true, 00:19:45.260 "zcopy": true, 00:19:45.260 "get_zone_info": false, 00:19:45.260 "zone_management": false, 00:19:45.260 "zone_append": false, 00:19:45.260 "compare": false, 00:19:45.260 "compare_and_write": false, 00:19:45.260 "abort": true, 00:19:45.260 "seek_hole": false, 00:19:45.260 "seek_data": false, 00:19:45.260 "copy": true, 00:19:45.260 "nvme_iov_md": false 00:19:45.260 }, 00:19:45.260 "memory_domains": [ 00:19:45.260 { 00:19:45.260 "dma_device_id": "system", 00:19:45.260 "dma_device_type": 1 00:19:45.260 }, 00:19:45.260 { 00:19:45.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.260 "dma_device_type": 2 00:19:45.260 } 00:19:45.260 ], 00:19:45.260 "driver_specific": { 00:19:45.260 "passthru": { 00:19:45.260 "name": "pt3", 00:19:45.260 "base_bdev_name": "malloc3" 00:19:45.260 } 00:19:45.260 } 00:19:45.260 }' 00:19:45.260 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:45.260 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:45.260 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:45.260 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:45.260 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:45.260 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:45.260 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:45.260 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:45.520 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:45.520 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:45.520 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:45.521 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:45.521 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:45.521 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:45.521 
22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:45.795 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:45.795 "name": "pt4", 00:19:45.795 "aliases": [ 00:19:45.795 "00000000-0000-0000-0000-000000000004" 00:19:45.795 ], 00:19:45.795 "product_name": "passthru", 00:19:45.795 "block_size": 512, 00:19:45.795 "num_blocks": 65536, 00:19:45.795 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:45.795 "assigned_rate_limits": { 00:19:45.795 "rw_ios_per_sec": 0, 00:19:45.795 "rw_mbytes_per_sec": 0, 00:19:45.795 "r_mbytes_per_sec": 0, 00:19:45.795 "w_mbytes_per_sec": 0 00:19:45.795 }, 00:19:45.795 "claimed": true, 00:19:45.795 "claim_type": "exclusive_write", 00:19:45.795 "zoned": false, 00:19:45.795 "supported_io_types": { 00:19:45.795 "read": true, 00:19:45.795 "write": true, 00:19:45.796 "unmap": true, 00:19:45.796 "flush": true, 00:19:45.796 "reset": true, 00:19:45.796 "nvme_admin": false, 00:19:45.796 "nvme_io": false, 00:19:45.796 "nvme_io_md": false, 00:19:45.796 "write_zeroes": true, 00:19:45.796 "zcopy": true, 00:19:45.796 "get_zone_info": false, 00:19:45.796 "zone_management": false, 00:19:45.796 "zone_append": false, 00:19:45.796 "compare": false, 00:19:45.796 "compare_and_write": false, 00:19:45.796 "abort": true, 00:19:45.796 "seek_hole": false, 00:19:45.796 "seek_data": false, 00:19:45.796 "copy": true, 00:19:45.796 "nvme_iov_md": false 00:19:45.796 }, 00:19:45.796 "memory_domains": [ 00:19:45.796 { 00:19:45.796 "dma_device_id": "system", 00:19:45.796 "dma_device_type": 1 00:19:45.796 }, 00:19:45.796 { 00:19:45.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.796 "dma_device_type": 2 00:19:45.796 } 00:19:45.796 ], 00:19:45.796 "driver_specific": { 00:19:45.796 "passthru": { 00:19:45.796 "name": "pt4", 00:19:45.796 "base_bdev_name": "malloc4" 00:19:45.796 } 00:19:45.796 } 00:19:45.796 }' 00:19:45.796 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:45.796 22:26:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:45.796 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:45.796 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:45.796 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:46.065 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:46.065 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:46.065 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:46.065 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:46.065 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:46.065 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:46.065 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:46.065 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:46.065 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:46.324 [2024-07-12 22:26:56.523512] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:46.324 22:26:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 74a76d5a-1313-4293-96ed-de26a648df9c '!=' 74a76d5a-1313-4293-96ed-de26a648df9c ']' 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3491582 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3491582 ']' 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3491582 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3491582 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3491582' 00:19:46.324 killing process with pid 3491582 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3491582 00:19:46.324 [2024-07-12 22:26:56.595450] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:46.324 [2024-07-12 22:26:56.595512] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:46.324 [2024-07-12 22:26:56.595575] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:46.324 [2024-07-12 22:26:56.595588] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19b2780 name raid_bdev1, state offline 00:19:46.324 22:26:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3491582 00:19:46.324 [2024-07-12 22:26:56.636325] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:46.583 22:26:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:19:46.583 00:19:46.583 real 0m16.467s 00:19:46.583 user 0m29.666s 00:19:46.583 sys 0m2.989s 00:19:46.583 22:26:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:46.583 22:26:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:46.583 ************************************ 00:19:46.583 END TEST raid_superblock_test 00:19:46.583 ************************************ 00:19:46.583 22:26:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:46.583 22:26:56 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:19:46.583 22:26:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:46.583 22:26:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:46.583 22:26:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:46.841 ************************************ 00:19:46.841 START TEST raid_read_error_test 00:19:46.841 ************************************ 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- 
# raid_io_error_test raid0 4 read 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:46.841 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.t1Aaew1gkL 00:19:46.842 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3494023 00:19:46.842 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3494023 /var/tmp/spdk-raid.sock 00:19:46.842 22:26:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:46.842 22:26:56 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3494023 ']' 00:19:46.842 22:26:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:46.842 22:26:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:46.842 22:26:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:46.842 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:46.842 22:26:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:46.842 22:26:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:46.842 [2024-07-12 22:26:57.029702] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:19:46.842 [2024-07-12 22:26:57.029778] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3494023 ] 00:19:46.842 [2024-07-12 22:26:57.158191] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:47.100 [2024-07-12 22:26:57.263316] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:47.100 [2024-07-12 22:26:57.332067] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:47.100 [2024-07-12 22:26:57.332106] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:47.666 22:26:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:47.666 22:26:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:47.666 22:26:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:47.666 22:26:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:47.924 BaseBdev1_malloc 00:19:47.924 22:26:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:48.182 true 00:19:48.182 22:26:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:48.440 [2024-07-12 22:26:58.686871] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:48.440 [2024-07-12 22:26:58.686922] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:48.440 [2024-07-12 22:26:58.686950] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1de20d0 00:19:48.440 [2024-07-12 22:26:58.686963] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:48.440 [2024-07-12 22:26:58.688789] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:48.440 [2024-07-12 22:26:58.688819] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:48.440 BaseBdev1 00:19:48.440 22:26:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # 
for bdev in "${base_bdevs[@]}" 00:19:48.440 22:26:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:48.698 BaseBdev2_malloc 00:19:48.698 22:26:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:48.956 true 00:19:48.956 22:26:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:49.215 [2024-07-12 22:26:59.425528] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:49.215 [2024-07-12 22:26:59.425577] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:49.215 [2024-07-12 22:26:59.425600] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1de6910 00:19:49.215 [2024-07-12 22:26:59.425613] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:49.215 [2024-07-12 22:26:59.427227] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:49.215 [2024-07-12 22:26:59.427258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:49.215 BaseBdev2 00:19:49.215 22:26:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:49.215 22:26:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:49.473 BaseBdev3_malloc 00:19:49.473 22:26:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:49.731 true 00:19:49.731 22:26:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:49.989 [2024-07-12 22:27:00.164006] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:49.989 [2024-07-12 22:27:00.164059] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:49.989 [2024-07-12 22:27:00.164082] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1de8bd0 00:19:49.989 [2024-07-12 22:27:00.164095] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:49.989 [2024-07-12 22:27:00.165711] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:49.989 [2024-07-12 22:27:00.165738] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:49.989 BaseBdev3 00:19:49.989 22:27:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:49.989 22:27:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:50.246 BaseBdev4_malloc 00:19:50.246 22:27:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:50.503 true 00:19:50.503 22:27:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:50.761 [2024-07-12 22:27:00.906653] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:50.761 [2024-07-12 22:27:00.906703] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:50.761 [2024-07-12 22:27:00.906726] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1de9aa0 00:19:50.761 [2024-07-12 22:27:00.906739] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:50.761 [2024-07-12 22:27:00.908323] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:50.761 [2024-07-12 22:27:00.908350] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:50.761 BaseBdev4 00:19:50.761 22:27:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:51.018 [2024-07-12 22:27:01.151333] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:51.018 [2024-07-12 22:27:01.152686] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:51.018 [2024-07-12 22:27:01.152757] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:51.018 [2024-07-12 22:27:01.152818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:51.018 [2024-07-12 22:27:01.153064] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1de3c20 00:19:51.018 [2024-07-12 22:27:01.153076] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:51.018 [2024-07-12 22:27:01.153282] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c38260 00:19:51.018 [2024-07-12 22:27:01.153436] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1de3c20 00:19:51.018 [2024-07-12 22:27:01.153446] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1de3c20 00:19:51.018 [2024-07-12 22:27:01.153556] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:51.018 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:51.018 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:51.018 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:51.018 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:51.018 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:51.018 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:51.018 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.018 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:19:51.018 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.018 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.018 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.018 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:51.276 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.276 "name": "raid_bdev1", 00:19:51.276 "uuid": "60e336e6-ffa6-4737-907b-733f60e6ff4b", 00:19:51.276 "strip_size_kb": 64, 00:19:51.276 "state": "online", 00:19:51.276 "raid_level": "raid0", 00:19:51.276 "superblock": true, 00:19:51.276 "num_base_bdevs": 4, 00:19:51.276 "num_base_bdevs_discovered": 4, 00:19:51.276 "num_base_bdevs_operational": 4, 00:19:51.276 "base_bdevs_list": [ 00:19:51.276 { 00:19:51.276 "name": "BaseBdev1", 00:19:51.276 "uuid": "481034f3-6b55-509e-af02-e87b51db9fb9", 00:19:51.276 "is_configured": true, 00:19:51.276 "data_offset": 2048, 00:19:51.276 "data_size": 63488 00:19:51.276 }, 00:19:51.276 { 00:19:51.276 "name": "BaseBdev2", 00:19:51.276 "uuid": "dffe500d-55fc-53ff-b301-536ab28c21f8", 00:19:51.276 "is_configured": true, 00:19:51.276 "data_offset": 2048, 00:19:51.276 "data_size": 63488 00:19:51.276 }, 00:19:51.276 { 00:19:51.276 "name": "BaseBdev3", 00:19:51.276 "uuid": "33849fb2-96b1-50ff-a89d-b181f6834e52", 00:19:51.276 "is_configured": true, 00:19:51.276 "data_offset": 2048, 00:19:51.276 "data_size": 63488 00:19:51.276 }, 00:19:51.276 { 00:19:51.276 "name": "BaseBdev4", 00:19:51.276 "uuid": "0d3f2ff0-cc2d-51ff-a538-9e62f04c8161", 00:19:51.276 "is_configured": true, 00:19:51.276 "data_offset": 2048, 00:19:51.276 "data_size": 63488 00:19:51.276 } 00:19:51.276 ] 00:19:51.276 }' 00:19:51.276 22:27:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.276 22:27:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:51.842 22:27:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:51.842 22:27:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:51.842 [2024-07-12 22:27:02.134220] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dd5fc0 00:19:52.776 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.033 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:53.291 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.291 "name": "raid_bdev1", 00:19:53.291 "uuid": "60e336e6-ffa6-4737-907b-733f60e6ff4b", 00:19:53.291 "strip_size_kb": 64, 00:19:53.291 "state": "online", 00:19:53.291 "raid_level": "raid0", 00:19:53.291 "superblock": true, 00:19:53.291 "num_base_bdevs": 4, 00:19:53.291 "num_base_bdevs_discovered": 4, 00:19:53.291 "num_base_bdevs_operational": 4, 00:19:53.291 "base_bdevs_list": [ 00:19:53.291 { 00:19:53.291 "name": "BaseBdev1", 00:19:53.291 "uuid": "481034f3-6b55-509e-af02-e87b51db9fb9", 00:19:53.291 "is_configured": true, 00:19:53.291 "data_offset": 2048, 00:19:53.291 "data_size": 63488 00:19:53.291 }, 00:19:53.291 { 00:19:53.291 "name": "BaseBdev2", 00:19:53.291 "uuid": "dffe500d-55fc-53ff-b301-536ab28c21f8", 00:19:53.291 "is_configured": true, 00:19:53.291 "data_offset": 2048, 00:19:53.291 "data_size": 63488 00:19:53.291 }, 00:19:53.291 { 00:19:53.291 "name": "BaseBdev3", 00:19:53.291 "uuid": "33849fb2-96b1-50ff-a89d-b181f6834e52", 00:19:53.291 "is_configured": true, 00:19:53.291 "data_offset": 2048, 00:19:53.291 "data_size": 63488 00:19:53.291 }, 00:19:53.291 { 00:19:53.291 "name": "BaseBdev4", 00:19:53.291 "uuid": "0d3f2ff0-cc2d-51ff-a538-9e62f04c8161", 00:19:53.291 "is_configured": true, 00:19:53.291 "data_offset": 2048, 00:19:53.291 "data_size": 63488 00:19:53.291 } 00:19:53.291 ] 00:19:53.291 }' 00:19:53.291 22:27:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.291 22:27:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:53.856 22:27:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:54.113 [2024-07-12 22:27:04.283416] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:54.113 [2024-07-12 22:27:04.283451] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:54.113 [2024-07-12 22:27:04.286614] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:54.113 [2024-07-12 22:27:04.286654] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:54.113 [2024-07-12 22:27:04.286695] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:54.113 [2024-07-12 22:27:04.286706] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1de3c20 name raid_bdev1, state offline 00:19:54.113 0 00:19:54.113 22:27:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3494023 00:19:54.113 22:27:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3494023 ']' 00:19:54.113 22:27:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3494023 00:19:54.113 22:27:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:19:54.113 22:27:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:54.113 22:27:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3494023 00:19:54.113 22:27:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:54.113 22:27:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:54.113 22:27:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3494023' 00:19:54.113 killing process with pid 3494023 00:19:54.113 22:27:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3494023 00:19:54.113 [2024-07-12 22:27:04.352000] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:54.113 22:27:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3494023 00:19:54.113 [2024-07-12 22:27:04.385895] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:54.370 22:27:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:54.370 22:27:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.t1Aaew1gkL 00:19:54.370 22:27:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:54.370 22:27:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:19:54.370 22:27:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:54.370 22:27:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:54.370 22:27:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:54.370 22:27:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:19:54.370 00:19:54.370 real 0m7.684s 00:19:54.370 user 0m12.233s 00:19:54.370 sys 0m1.414s 00:19:54.370 22:27:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:54.370 22:27:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:54.370 ************************************ 00:19:54.370 END TEST raid_read_error_test 00:19:54.370 ************************************ 00:19:54.370 22:27:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:54.370 22:27:04 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:19:54.370 22:27:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:54.370 22:27:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:54.370 22:27:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:54.628 ************************************ 00:19:54.628 START TEST raid_write_error_test 00:19:54.628 ************************************ 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 
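For orientation, the raid_io_error_test runs traced above (read) and below (write) follow the same pattern: each of the four base bdevs is built as a malloc bdev wrapped first by an error-injection bdev (EE_BaseBdevN_malloc) and then by a passthru bdev (BaseBdevN); the four passthru bdevs are assembled into a raid0 volume with an on-disk superblock; bdevperf drives random I/O against it while a failure is injected into the first base bdev; and the failure rate is read back out of the bdevperf log. The following is a condensed sketch of that RPC sequence, pieced together from the trace rather than taken from the test script itself, and it assumes an SPDK target is already listening on /var/tmp/spdk-raid.sock and that bdevperf wrote its log to $bdevperf_log:

    # Sketch only -- condensed from the xtrace above, not the actual bdev_raid.sh helpers.
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    for i in 1 2 3 4; do
        $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc          # backing store
        $rpc bdev_error_create BaseBdev${i}_malloc                     # error-injection wrapper (EE_BaseBdev${i}_malloc)
        $rpc bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
    done

    # Assemble the four passthru bdevs into a raid0 volume with a superblock (-s).
    $rpc bdev_raid_create -z 64 -r raid0 \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s

    # Inject write failures into the first base bdev while bdevperf runs I/O,
    # then pull the reported failures-per-second out of the bdevperf log.
    $rpc bdev_error_inject_error EE_BaseBdev1_malloc write failure
    grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}'

The write-variant trace resumes below; it differs from the read variant only in the injected I/O type and the bdevperf log file it checks.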
00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Y1N1VklKd9 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3495434 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3495434 /var/tmp/spdk-raid.sock 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:54.628 22:27:04 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3495434 ']' 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:54.628 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:54.628 22:27:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:54.628 [2024-07-12 22:27:04.804512] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:19:54.628 [2024-07-12 22:27:04.804587] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3495434 ] 00:19:54.628 [2024-07-12 22:27:04.935201] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:54.885 [2024-07-12 22:27:05.035508] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:54.885 [2024-07-12 22:27:05.101232] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:54.885 [2024-07-12 22:27:05.101274] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:55.451 22:27:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:55.451 22:27:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:55.451 22:27:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:55.451 22:27:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:55.708 BaseBdev1_malloc 00:19:55.708 22:27:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:55.964 true 00:19:55.964 22:27:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:56.221 [2024-07-12 22:27:06.447993] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:56.221 [2024-07-12 22:27:06.448055] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:56.221 [2024-07-12 22:27:06.448078] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d1f0d0 00:19:56.221 [2024-07-12 22:27:06.448091] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:56.221 [2024-07-12 22:27:06.449952] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:56.221 [2024-07-12 22:27:06.449981] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:56.221 BaseBdev1 00:19:56.221 22:27:06 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:56.221 22:27:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:56.785 BaseBdev2_malloc 00:19:56.785 22:27:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:57.042 true 00:19:57.042 22:27:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:57.606 [2024-07-12 22:27:07.701278] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:57.606 [2024-07-12 22:27:07.701328] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:57.606 [2024-07-12 22:27:07.701350] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d23910 00:19:57.606 [2024-07-12 22:27:07.701363] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:57.606 [2024-07-12 22:27:07.702973] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:57.606 [2024-07-12 22:27:07.703002] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:57.606 BaseBdev2 00:19:57.606 22:27:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:57.606 22:27:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:57.862 BaseBdev3_malloc 00:19:57.862 22:27:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:58.118 true 00:19:58.118 22:27:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:58.118 [2024-07-12 22:27:08.420163] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:58.118 [2024-07-12 22:27:08.420213] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:58.118 [2024-07-12 22:27:08.420235] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d25bd0 00:19:58.118 [2024-07-12 22:27:08.420247] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:58.118 [2024-07-12 22:27:08.421828] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:58.119 [2024-07-12 22:27:08.421859] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:58.119 BaseBdev3 00:19:58.119 22:27:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:58.119 22:27:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:58.682 BaseBdev4_malloc 00:19:58.682 22:27:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:58.938 true 00:19:58.938 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:59.501 [2024-07-12 22:27:09.673537] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:59.501 [2024-07-12 22:27:09.673587] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:59.501 [2024-07-12 22:27:09.673609] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d26aa0 00:19:59.501 [2024-07-12 22:27:09.673622] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:59.501 [2024-07-12 22:27:09.675216] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:59.501 [2024-07-12 22:27:09.675245] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:59.501 BaseBdev4 00:19:59.501 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:59.758 [2024-07-12 22:27:09.922237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:59.758 [2024-07-12 22:27:09.923660] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:59.758 [2024-07-12 22:27:09.923730] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:59.758 [2024-07-12 22:27:09.923791] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:59.758 [2024-07-12 22:27:09.924044] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d20c20 00:19:59.758 [2024-07-12 22:27:09.924056] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:59.758 [2024-07-12 22:27:09.924265] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b75260 00:19:59.758 [2024-07-12 22:27:09.924423] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d20c20 00:19:59.758 [2024-07-12 22:27:09.924433] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d20c20 00:19:59.758 [2024-07-12 22:27:09.924545] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:59.758 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:59.758 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:59.758 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:59.758 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:59.758 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:59.758 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:59.758 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.758 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:59.758 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.758 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.758 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.758 22:27:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:00.015 22:27:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:00.015 "name": "raid_bdev1", 00:20:00.015 "uuid": "af07148e-fad6-4696-898b-ff4f0cc1952b", 00:20:00.015 "strip_size_kb": 64, 00:20:00.015 "state": "online", 00:20:00.015 "raid_level": "raid0", 00:20:00.015 "superblock": true, 00:20:00.015 "num_base_bdevs": 4, 00:20:00.015 "num_base_bdevs_discovered": 4, 00:20:00.015 "num_base_bdevs_operational": 4, 00:20:00.015 "base_bdevs_list": [ 00:20:00.015 { 00:20:00.015 "name": "BaseBdev1", 00:20:00.015 "uuid": "4866c443-9c06-5e42-b1ea-10f5db764783", 00:20:00.015 "is_configured": true, 00:20:00.015 "data_offset": 2048, 00:20:00.015 "data_size": 63488 00:20:00.015 }, 00:20:00.015 { 00:20:00.015 "name": "BaseBdev2", 00:20:00.015 "uuid": "750f2c94-7fbf-57b8-be6f-86cae87cfd52", 00:20:00.015 "is_configured": true, 00:20:00.015 "data_offset": 2048, 00:20:00.015 "data_size": 63488 00:20:00.015 }, 00:20:00.015 { 00:20:00.015 "name": "BaseBdev3", 00:20:00.015 "uuid": "07c434b6-8914-54e6-a21e-c148212e6bc0", 00:20:00.015 "is_configured": true, 00:20:00.015 "data_offset": 2048, 00:20:00.015 "data_size": 63488 00:20:00.015 }, 00:20:00.015 { 00:20:00.015 "name": "BaseBdev4", 00:20:00.015 "uuid": "f7baf761-f86b-5cbc-8603-81d26f7ce8d2", 00:20:00.015 "is_configured": true, 00:20:00.015 "data_offset": 2048, 00:20:00.015 "data_size": 63488 00:20:00.015 } 00:20:00.015 ] 00:20:00.015 }' 00:20:00.015 22:27:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:00.015 22:27:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:00.578 22:27:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:00.578 22:27:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:00.578 [2024-07-12 22:27:10.865016] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d12fc0 00:20:01.513 22:27:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:01.771 22:27:12 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:01.771 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.772 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.033 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.033 "name": "raid_bdev1", 00:20:02.033 "uuid": "af07148e-fad6-4696-898b-ff4f0cc1952b", 00:20:02.033 "strip_size_kb": 64, 00:20:02.033 "state": "online", 00:20:02.033 "raid_level": "raid0", 00:20:02.033 "superblock": true, 00:20:02.033 "num_base_bdevs": 4, 00:20:02.034 "num_base_bdevs_discovered": 4, 00:20:02.034 "num_base_bdevs_operational": 4, 00:20:02.034 "base_bdevs_list": [ 00:20:02.034 { 00:20:02.034 "name": "BaseBdev1", 00:20:02.034 "uuid": "4866c443-9c06-5e42-b1ea-10f5db764783", 00:20:02.034 "is_configured": true, 00:20:02.034 "data_offset": 2048, 00:20:02.034 "data_size": 63488 00:20:02.034 }, 00:20:02.034 { 00:20:02.034 "name": "BaseBdev2", 00:20:02.034 "uuid": "750f2c94-7fbf-57b8-be6f-86cae87cfd52", 00:20:02.034 "is_configured": true, 00:20:02.034 "data_offset": 2048, 00:20:02.034 "data_size": 63488 00:20:02.034 }, 00:20:02.034 { 00:20:02.034 "name": "BaseBdev3", 00:20:02.034 "uuid": "07c434b6-8914-54e6-a21e-c148212e6bc0", 00:20:02.034 "is_configured": true, 00:20:02.034 "data_offset": 2048, 00:20:02.034 "data_size": 63488 00:20:02.034 }, 00:20:02.034 { 00:20:02.034 "name": "BaseBdev4", 00:20:02.034 "uuid": "f7baf761-f86b-5cbc-8603-81d26f7ce8d2", 00:20:02.034 "is_configured": true, 00:20:02.034 "data_offset": 2048, 00:20:02.034 "data_size": 63488 00:20:02.034 } 00:20:02.034 ] 00:20:02.034 }' 00:20:02.034 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.034 22:27:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:02.657 22:27:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:02.916 [2024-07-12 22:27:13.098156] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:02.916 [2024-07-12 22:27:13.098200] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:02.916 [2024-07-12 22:27:13.101370] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:02.916 [2024-07-12 22:27:13.101409] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:02.916 [2024-07-12 22:27:13.101450] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:02.916 [2024-07-12 
22:27:13.101462] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d20c20 name raid_bdev1, state offline 00:20:02.916 0 00:20:02.916 22:27:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3495434 00:20:02.916 22:27:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3495434 ']' 00:20:02.916 22:27:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3495434 00:20:02.916 22:27:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:20:02.916 22:27:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:02.916 22:27:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3495434 00:20:02.916 22:27:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:02.916 22:27:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:02.916 22:27:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3495434' 00:20:02.916 killing process with pid 3495434 00:20:02.916 22:27:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3495434 00:20:02.916 [2024-07-12 22:27:13.165666] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:02.916 22:27:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3495434 00:20:02.916 [2024-07-12 22:27:13.196771] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:03.175 22:27:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Y1N1VklKd9 00:20:03.175 22:27:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:03.175 22:27:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:03.175 22:27:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:20:03.175 22:27:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:20:03.175 22:27:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:03.175 22:27:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:03.175 22:27:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:20:03.175 00:20:03.175 real 0m8.696s 00:20:03.175 user 0m14.253s 00:20:03.175 sys 0m1.404s 00:20:03.176 22:27:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:03.176 22:27:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:03.176 ************************************ 00:20:03.176 END TEST raid_write_error_test 00:20:03.176 ************************************ 00:20:03.176 22:27:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:03.176 22:27:13 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:20:03.176 22:27:13 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:20:03.176 22:27:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:03.176 22:27:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:03.176 22:27:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:03.176 ************************************ 00:20:03.176 START TEST raid_state_function_test 
00:20:03.176 ************************************ 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:03.176 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:03.435 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3496834 00:20:03.435 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 
3496834' 00:20:03.435 Process raid pid: 3496834 00:20:03.435 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3496834 /var/tmp/spdk-raid.sock 00:20:03.435 22:27:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3496834 ']' 00:20:03.435 22:27:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:03.435 22:27:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:03.435 22:27:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:03.435 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:03.435 22:27:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:03.435 22:27:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:03.435 22:27:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:03.435 [2024-07-12 22:27:13.562179] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:20:03.435 [2024-07-12 22:27:13.562244] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:03.435 [2024-07-12 22:27:13.689468] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:03.695 [2024-07-12 22:27:13.795488] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:03.695 [2024-07-12 22:27:13.859093] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:03.695 [2024-07-12 22:27:13.859128] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:04.265 22:27:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:04.265 22:27:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:20:04.265 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:04.524 [2024-07-12 22:27:14.686565] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:04.524 [2024-07-12 22:27:14.686619] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:04.524 [2024-07-12 22:27:14.686630] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:04.524 [2024-07-12 22:27:14.686642] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:04.524 [2024-07-12 22:27:14.686650] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:04.524 [2024-07-12 22:27:14.686662] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:04.524 [2024-07-12 22:27:14.686670] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:04.524 [2024-07-12 22:27:14.686681] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:04.524 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:04.524 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:04.524 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:04.524 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:04.524 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:04.524 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:04.524 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.524 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.524 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.524 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.524 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.524 22:27:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:05.093 22:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:05.093 "name": "Existed_Raid", 00:20:05.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.093 "strip_size_kb": 64, 00:20:05.093 "state": "configuring", 00:20:05.093 "raid_level": "concat", 00:20:05.093 "superblock": false, 00:20:05.093 "num_base_bdevs": 4, 00:20:05.093 "num_base_bdevs_discovered": 0, 00:20:05.093 "num_base_bdevs_operational": 4, 00:20:05.093 "base_bdevs_list": [ 00:20:05.093 { 00:20:05.093 "name": "BaseBdev1", 00:20:05.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.093 "is_configured": false, 00:20:05.093 "data_offset": 0, 00:20:05.093 "data_size": 0 00:20:05.093 }, 00:20:05.093 { 00:20:05.093 "name": "BaseBdev2", 00:20:05.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.093 "is_configured": false, 00:20:05.093 "data_offset": 0, 00:20:05.093 "data_size": 0 00:20:05.093 }, 00:20:05.093 { 00:20:05.093 "name": "BaseBdev3", 00:20:05.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.093 "is_configured": false, 00:20:05.093 "data_offset": 0, 00:20:05.093 "data_size": 0 00:20:05.093 }, 00:20:05.093 { 00:20:05.093 "name": "BaseBdev4", 00:20:05.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.093 "is_configured": false, 00:20:05.093 "data_offset": 0, 00:20:05.093 "data_size": 0 00:20:05.093 } 00:20:05.093 ] 00:20:05.093 }' 00:20:05.093 22:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:05.093 22:27:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:05.660 22:27:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:05.919 [2024-07-12 22:27:16.005881] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:05.919 
[2024-07-12 22:27:16.005914] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16bfaa0 name Existed_Raid, state configuring 00:20:05.919 22:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:06.179 [2024-07-12 22:27:16.250552] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:06.179 [2024-07-12 22:27:16.250579] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:06.179 [2024-07-12 22:27:16.250589] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:06.179 [2024-07-12 22:27:16.250600] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:06.179 [2024-07-12 22:27:16.250609] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:06.179 [2024-07-12 22:27:16.250620] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:06.179 [2024-07-12 22:27:16.250629] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:06.179 [2024-07-12 22:27:16.250640] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:06.179 22:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:06.438 [2024-07-12 22:27:16.509127] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:06.438 BaseBdev1 00:20:06.438 22:27:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:06.438 22:27:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:06.438 22:27:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:06.438 22:27:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:06.438 22:27:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:06.438 22:27:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:06.438 22:27:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:06.698 22:27:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:06.698 [ 00:20:06.698 { 00:20:06.698 "name": "BaseBdev1", 00:20:06.698 "aliases": [ 00:20:06.698 "75ee1d31-6789-4fe9-baee-ab5954fe8908" 00:20:06.698 ], 00:20:06.698 "product_name": "Malloc disk", 00:20:06.698 "block_size": 512, 00:20:06.698 "num_blocks": 65536, 00:20:06.698 "uuid": "75ee1d31-6789-4fe9-baee-ab5954fe8908", 00:20:06.698 "assigned_rate_limits": { 00:20:06.698 "rw_ios_per_sec": 0, 00:20:06.698 "rw_mbytes_per_sec": 0, 00:20:06.698 "r_mbytes_per_sec": 0, 00:20:06.698 "w_mbytes_per_sec": 0 00:20:06.698 }, 00:20:06.698 "claimed": true, 00:20:06.698 "claim_type": "exclusive_write", 00:20:06.698 "zoned": false, 
00:20:06.698 "supported_io_types": { 00:20:06.698 "read": true, 00:20:06.698 "write": true, 00:20:06.698 "unmap": true, 00:20:06.698 "flush": true, 00:20:06.698 "reset": true, 00:20:06.698 "nvme_admin": false, 00:20:06.698 "nvme_io": false, 00:20:06.698 "nvme_io_md": false, 00:20:06.698 "write_zeroes": true, 00:20:06.698 "zcopy": true, 00:20:06.698 "get_zone_info": false, 00:20:06.698 "zone_management": false, 00:20:06.698 "zone_append": false, 00:20:06.698 "compare": false, 00:20:06.698 "compare_and_write": false, 00:20:06.698 "abort": true, 00:20:06.698 "seek_hole": false, 00:20:06.698 "seek_data": false, 00:20:06.698 "copy": true, 00:20:06.698 "nvme_iov_md": false 00:20:06.698 }, 00:20:06.698 "memory_domains": [ 00:20:06.698 { 00:20:06.698 "dma_device_id": "system", 00:20:06.698 "dma_device_type": 1 00:20:06.698 }, 00:20:06.698 { 00:20:06.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:06.698 "dma_device_type": 2 00:20:06.698 } 00:20:06.698 ], 00:20:06.698 "driver_specific": {} 00:20:06.698 } 00:20:06.698 ] 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.698 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:06.957 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:06.957 "name": "Existed_Raid", 00:20:06.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.957 "strip_size_kb": 64, 00:20:06.957 "state": "configuring", 00:20:06.957 "raid_level": "concat", 00:20:06.957 "superblock": false, 00:20:06.957 "num_base_bdevs": 4, 00:20:06.957 "num_base_bdevs_discovered": 1, 00:20:06.957 "num_base_bdevs_operational": 4, 00:20:06.957 "base_bdevs_list": [ 00:20:06.957 { 00:20:06.957 "name": "BaseBdev1", 00:20:06.957 "uuid": "75ee1d31-6789-4fe9-baee-ab5954fe8908", 00:20:06.957 "is_configured": true, 00:20:06.957 "data_offset": 0, 00:20:06.957 "data_size": 65536 00:20:06.957 }, 00:20:06.957 { 00:20:06.957 "name": "BaseBdev2", 00:20:06.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.957 "is_configured": false, 00:20:06.957 "data_offset": 0, 00:20:06.957 
"data_size": 0 00:20:06.957 }, 00:20:06.957 { 00:20:06.957 "name": "BaseBdev3", 00:20:06.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.957 "is_configured": false, 00:20:06.957 "data_offset": 0, 00:20:06.957 "data_size": 0 00:20:06.957 }, 00:20:06.957 { 00:20:06.957 "name": "BaseBdev4", 00:20:06.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.957 "is_configured": false, 00:20:06.957 "data_offset": 0, 00:20:06.957 "data_size": 0 00:20:06.957 } 00:20:06.957 ] 00:20:06.958 }' 00:20:06.958 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:06.958 22:27:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:07.526 22:27:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:07.785 [2024-07-12 22:27:18.013114] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:07.785 [2024-07-12 22:27:18.013153] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16bf310 name Existed_Raid, state configuring 00:20:07.785 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:08.044 [2024-07-12 22:27:18.261808] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:08.044 [2024-07-12 22:27:18.263252] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:08.044 [2024-07-12 22:27:18.263284] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:08.044 [2024-07-12 22:27:18.263296] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:08.044 [2024-07-12 22:27:18.263308] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:08.044 [2024-07-12 22:27:18.263317] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:08.044 [2024-07-12 22:27:18.263328] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:08.044 22:27:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.044 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:08.303 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:08.303 "name": "Existed_Raid", 00:20:08.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.303 "strip_size_kb": 64, 00:20:08.303 "state": "configuring", 00:20:08.303 "raid_level": "concat", 00:20:08.303 "superblock": false, 00:20:08.303 "num_base_bdevs": 4, 00:20:08.303 "num_base_bdevs_discovered": 1, 00:20:08.303 "num_base_bdevs_operational": 4, 00:20:08.303 "base_bdevs_list": [ 00:20:08.303 { 00:20:08.303 "name": "BaseBdev1", 00:20:08.303 "uuid": "75ee1d31-6789-4fe9-baee-ab5954fe8908", 00:20:08.303 "is_configured": true, 00:20:08.303 "data_offset": 0, 00:20:08.303 "data_size": 65536 00:20:08.303 }, 00:20:08.303 { 00:20:08.303 "name": "BaseBdev2", 00:20:08.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.303 "is_configured": false, 00:20:08.303 "data_offset": 0, 00:20:08.303 "data_size": 0 00:20:08.303 }, 00:20:08.303 { 00:20:08.303 "name": "BaseBdev3", 00:20:08.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.303 "is_configured": false, 00:20:08.303 "data_offset": 0, 00:20:08.303 "data_size": 0 00:20:08.303 }, 00:20:08.303 { 00:20:08.303 "name": "BaseBdev4", 00:20:08.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.303 "is_configured": false, 00:20:08.303 "data_offset": 0, 00:20:08.303 "data_size": 0 00:20:08.303 } 00:20:08.303 ] 00:20:08.303 }' 00:20:08.303 22:27:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:08.303 22:27:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:08.870 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:09.129 [2024-07-12 22:27:19.372245] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:09.129 BaseBdev2 00:20:09.129 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:09.129 22:27:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:09.129 22:27:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:09.129 22:27:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:09.129 22:27:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:09.129 22:27:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:09.129 22:27:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:09.389 22:27:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:09.649 [ 00:20:09.649 { 00:20:09.649 "name": "BaseBdev2", 00:20:09.649 "aliases": [ 00:20:09.649 "c86314a3-238a-4bce-8799-c3e88904d77b" 00:20:09.649 ], 00:20:09.649 "product_name": "Malloc disk", 00:20:09.649 "block_size": 512, 00:20:09.649 "num_blocks": 65536, 00:20:09.649 "uuid": "c86314a3-238a-4bce-8799-c3e88904d77b", 00:20:09.649 "assigned_rate_limits": { 00:20:09.649 "rw_ios_per_sec": 0, 00:20:09.649 "rw_mbytes_per_sec": 0, 00:20:09.649 "r_mbytes_per_sec": 0, 00:20:09.649 "w_mbytes_per_sec": 0 00:20:09.649 }, 00:20:09.649 "claimed": true, 00:20:09.649 "claim_type": "exclusive_write", 00:20:09.649 "zoned": false, 00:20:09.649 "supported_io_types": { 00:20:09.649 "read": true, 00:20:09.649 "write": true, 00:20:09.649 "unmap": true, 00:20:09.649 "flush": true, 00:20:09.649 "reset": true, 00:20:09.649 "nvme_admin": false, 00:20:09.649 "nvme_io": false, 00:20:09.649 "nvme_io_md": false, 00:20:09.649 "write_zeroes": true, 00:20:09.649 "zcopy": true, 00:20:09.649 "get_zone_info": false, 00:20:09.649 "zone_management": false, 00:20:09.649 "zone_append": false, 00:20:09.649 "compare": false, 00:20:09.649 "compare_and_write": false, 00:20:09.649 "abort": true, 00:20:09.649 "seek_hole": false, 00:20:09.649 "seek_data": false, 00:20:09.649 "copy": true, 00:20:09.649 "nvme_iov_md": false 00:20:09.649 }, 00:20:09.649 "memory_domains": [ 00:20:09.649 { 00:20:09.649 "dma_device_id": "system", 00:20:09.649 "dma_device_type": 1 00:20:09.649 }, 00:20:09.649 { 00:20:09.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.649 "dma_device_type": 2 00:20:09.649 } 00:20:09.649 ], 00:20:09.649 "driver_specific": {} 00:20:09.649 } 00:20:09.649 ] 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.649 22:27:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:09.909 22:27:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:09.909 "name": "Existed_Raid", 00:20:09.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:09.909 "strip_size_kb": 64, 00:20:09.909 "state": "configuring", 00:20:09.909 "raid_level": "concat", 00:20:09.909 "superblock": false, 00:20:09.909 "num_base_bdevs": 4, 00:20:09.909 "num_base_bdevs_discovered": 2, 00:20:09.909 "num_base_bdevs_operational": 4, 00:20:09.909 "base_bdevs_list": [ 00:20:09.909 { 00:20:09.909 "name": "BaseBdev1", 00:20:09.909 "uuid": "75ee1d31-6789-4fe9-baee-ab5954fe8908", 00:20:09.909 "is_configured": true, 00:20:09.909 "data_offset": 0, 00:20:09.909 "data_size": 65536 00:20:09.909 }, 00:20:09.909 { 00:20:09.909 "name": "BaseBdev2", 00:20:09.909 "uuid": "c86314a3-238a-4bce-8799-c3e88904d77b", 00:20:09.909 "is_configured": true, 00:20:09.909 "data_offset": 0, 00:20:09.909 "data_size": 65536 00:20:09.909 }, 00:20:09.909 { 00:20:09.909 "name": "BaseBdev3", 00:20:09.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:09.909 "is_configured": false, 00:20:09.909 "data_offset": 0, 00:20:09.909 "data_size": 0 00:20:09.909 }, 00:20:09.909 { 00:20:09.909 "name": "BaseBdev4", 00:20:09.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:09.909 "is_configured": false, 00:20:09.909 "data_offset": 0, 00:20:09.909 "data_size": 0 00:20:09.909 } 00:20:09.909 ] 00:20:09.909 }' 00:20:09.909 22:27:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:09.909 22:27:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:10.478 22:27:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:10.735 [2024-07-12 22:27:20.899674] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:10.735 BaseBdev3 00:20:10.735 22:27:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:10.736 22:27:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:10.736 22:27:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:10.736 22:27:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:10.736 22:27:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:10.736 22:27:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:10.736 22:27:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:10.994 22:27:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:11.254 [ 00:20:11.254 { 00:20:11.254 "name": "BaseBdev3", 00:20:11.254 "aliases": [ 00:20:11.254 "15f79a78-8f92-4698-a6f5-5b8f6dcd8699" 00:20:11.254 ], 00:20:11.254 "product_name": "Malloc disk", 00:20:11.254 "block_size": 512, 00:20:11.254 "num_blocks": 65536, 00:20:11.254 "uuid": "15f79a78-8f92-4698-a6f5-5b8f6dcd8699", 00:20:11.254 "assigned_rate_limits": { 00:20:11.254 "rw_ios_per_sec": 0, 00:20:11.254 "rw_mbytes_per_sec": 0, 00:20:11.254 "r_mbytes_per_sec": 0, 
00:20:11.254 "w_mbytes_per_sec": 0 00:20:11.254 }, 00:20:11.254 "claimed": true, 00:20:11.254 "claim_type": "exclusive_write", 00:20:11.254 "zoned": false, 00:20:11.254 "supported_io_types": { 00:20:11.254 "read": true, 00:20:11.254 "write": true, 00:20:11.254 "unmap": true, 00:20:11.254 "flush": true, 00:20:11.254 "reset": true, 00:20:11.254 "nvme_admin": false, 00:20:11.254 "nvme_io": false, 00:20:11.254 "nvme_io_md": false, 00:20:11.254 "write_zeroes": true, 00:20:11.254 "zcopy": true, 00:20:11.254 "get_zone_info": false, 00:20:11.254 "zone_management": false, 00:20:11.254 "zone_append": false, 00:20:11.254 "compare": false, 00:20:11.254 "compare_and_write": false, 00:20:11.254 "abort": true, 00:20:11.254 "seek_hole": false, 00:20:11.254 "seek_data": false, 00:20:11.254 "copy": true, 00:20:11.254 "nvme_iov_md": false 00:20:11.254 }, 00:20:11.254 "memory_domains": [ 00:20:11.254 { 00:20:11.254 "dma_device_id": "system", 00:20:11.254 "dma_device_type": 1 00:20:11.254 }, 00:20:11.254 { 00:20:11.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.254 "dma_device_type": 2 00:20:11.254 } 00:20:11.254 ], 00:20:11.254 "driver_specific": {} 00:20:11.254 } 00:20:11.254 ] 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.254 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:11.514 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:11.514 "name": "Existed_Raid", 00:20:11.514 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.514 "strip_size_kb": 64, 00:20:11.514 "state": "configuring", 00:20:11.514 "raid_level": "concat", 00:20:11.514 "superblock": false, 00:20:11.514 "num_base_bdevs": 4, 00:20:11.514 "num_base_bdevs_discovered": 3, 00:20:11.514 "num_base_bdevs_operational": 4, 00:20:11.514 "base_bdevs_list": [ 00:20:11.514 { 00:20:11.514 "name": "BaseBdev1", 
00:20:11.514 "uuid": "75ee1d31-6789-4fe9-baee-ab5954fe8908", 00:20:11.514 "is_configured": true, 00:20:11.514 "data_offset": 0, 00:20:11.514 "data_size": 65536 00:20:11.514 }, 00:20:11.514 { 00:20:11.514 "name": "BaseBdev2", 00:20:11.514 "uuid": "c86314a3-238a-4bce-8799-c3e88904d77b", 00:20:11.514 "is_configured": true, 00:20:11.514 "data_offset": 0, 00:20:11.514 "data_size": 65536 00:20:11.514 }, 00:20:11.514 { 00:20:11.514 "name": "BaseBdev3", 00:20:11.514 "uuid": "15f79a78-8f92-4698-a6f5-5b8f6dcd8699", 00:20:11.514 "is_configured": true, 00:20:11.514 "data_offset": 0, 00:20:11.514 "data_size": 65536 00:20:11.514 }, 00:20:11.514 { 00:20:11.514 "name": "BaseBdev4", 00:20:11.514 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.514 "is_configured": false, 00:20:11.514 "data_offset": 0, 00:20:11.514 "data_size": 0 00:20:11.514 } 00:20:11.514 ] 00:20:11.514 }' 00:20:11.514 22:27:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:11.514 22:27:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:12.083 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:12.342 [2024-07-12 22:27:22.455166] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:12.343 [2024-07-12 22:27:22.455206] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16c0350 00:20:12.343 [2024-07-12 22:27:22.455215] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:12.343 [2024-07-12 22:27:22.455467] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c0020 00:20:12.343 [2024-07-12 22:27:22.455590] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16c0350 00:20:12.343 [2024-07-12 22:27:22.455601] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16c0350 00:20:12.343 [2024-07-12 22:27:22.455767] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:12.343 BaseBdev4 00:20:12.343 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:12.343 22:27:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:12.343 22:27:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:12.343 22:27:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:12.343 22:27:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:12.343 22:27:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:12.343 22:27:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:12.343 22:27:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:12.602 [ 00:20:12.602 { 00:20:12.602 "name": "BaseBdev4", 00:20:12.602 "aliases": [ 00:20:12.602 "a9c8b12a-3696-4177-a897-03168baf434f" 00:20:12.602 ], 00:20:12.602 "product_name": "Malloc disk", 00:20:12.602 "block_size": 512, 00:20:12.602 
"num_blocks": 65536, 00:20:12.602 "uuid": "a9c8b12a-3696-4177-a897-03168baf434f", 00:20:12.602 "assigned_rate_limits": { 00:20:12.602 "rw_ios_per_sec": 0, 00:20:12.602 "rw_mbytes_per_sec": 0, 00:20:12.602 "r_mbytes_per_sec": 0, 00:20:12.602 "w_mbytes_per_sec": 0 00:20:12.602 }, 00:20:12.602 "claimed": true, 00:20:12.602 "claim_type": "exclusive_write", 00:20:12.602 "zoned": false, 00:20:12.602 "supported_io_types": { 00:20:12.602 "read": true, 00:20:12.602 "write": true, 00:20:12.602 "unmap": true, 00:20:12.602 "flush": true, 00:20:12.602 "reset": true, 00:20:12.602 "nvme_admin": false, 00:20:12.602 "nvme_io": false, 00:20:12.602 "nvme_io_md": false, 00:20:12.602 "write_zeroes": true, 00:20:12.602 "zcopy": true, 00:20:12.602 "get_zone_info": false, 00:20:12.602 "zone_management": false, 00:20:12.602 "zone_append": false, 00:20:12.602 "compare": false, 00:20:12.602 "compare_and_write": false, 00:20:12.602 "abort": true, 00:20:12.602 "seek_hole": false, 00:20:12.602 "seek_data": false, 00:20:12.602 "copy": true, 00:20:12.602 "nvme_iov_md": false 00:20:12.602 }, 00:20:12.602 "memory_domains": [ 00:20:12.602 { 00:20:12.602 "dma_device_id": "system", 00:20:12.602 "dma_device_type": 1 00:20:12.602 }, 00:20:12.602 { 00:20:12.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:12.602 "dma_device_type": 2 00:20:12.602 } 00:20:12.602 ], 00:20:12.602 "driver_specific": {} 00:20:12.602 } 00:20:12.602 ] 00:20:12.602 22:27:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:12.602 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:12.602 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:12.602 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:12.602 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:12.602 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:12.602 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:12.602 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:12.602 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:12.602 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:12.602 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:12.603 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:12.603 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:12.603 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.603 22:27:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:12.861 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.861 "name": "Existed_Raid", 00:20:12.861 "uuid": "4ac2c682-c8d9-444b-9266-fddcd3850387", 00:20:12.861 "strip_size_kb": 64, 00:20:12.861 "state": "online", 00:20:12.861 "raid_level": "concat", 00:20:12.861 "superblock": false, 
00:20:12.861 "num_base_bdevs": 4, 00:20:12.861 "num_base_bdevs_discovered": 4, 00:20:12.861 "num_base_bdevs_operational": 4, 00:20:12.861 "base_bdevs_list": [ 00:20:12.861 { 00:20:12.861 "name": "BaseBdev1", 00:20:12.861 "uuid": "75ee1d31-6789-4fe9-baee-ab5954fe8908", 00:20:12.861 "is_configured": true, 00:20:12.861 "data_offset": 0, 00:20:12.861 "data_size": 65536 00:20:12.861 }, 00:20:12.861 { 00:20:12.861 "name": "BaseBdev2", 00:20:12.861 "uuid": "c86314a3-238a-4bce-8799-c3e88904d77b", 00:20:12.861 "is_configured": true, 00:20:12.861 "data_offset": 0, 00:20:12.861 "data_size": 65536 00:20:12.861 }, 00:20:12.861 { 00:20:12.861 "name": "BaseBdev3", 00:20:12.861 "uuid": "15f79a78-8f92-4698-a6f5-5b8f6dcd8699", 00:20:12.861 "is_configured": true, 00:20:12.861 "data_offset": 0, 00:20:12.861 "data_size": 65536 00:20:12.861 }, 00:20:12.861 { 00:20:12.861 "name": "BaseBdev4", 00:20:12.861 "uuid": "a9c8b12a-3696-4177-a897-03168baf434f", 00:20:12.861 "is_configured": true, 00:20:12.861 "data_offset": 0, 00:20:12.861 "data_size": 65536 00:20:12.861 } 00:20:12.861 ] 00:20:12.861 }' 00:20:12.861 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.861 22:27:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.427 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:13.427 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:13.427 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:13.427 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:13.427 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:13.427 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:13.427 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:13.427 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:13.685 [2024-07-12 22:27:23.807099] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:13.685 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:13.685 "name": "Existed_Raid", 00:20:13.685 "aliases": [ 00:20:13.685 "4ac2c682-c8d9-444b-9266-fddcd3850387" 00:20:13.685 ], 00:20:13.685 "product_name": "Raid Volume", 00:20:13.685 "block_size": 512, 00:20:13.685 "num_blocks": 262144, 00:20:13.685 "uuid": "4ac2c682-c8d9-444b-9266-fddcd3850387", 00:20:13.685 "assigned_rate_limits": { 00:20:13.685 "rw_ios_per_sec": 0, 00:20:13.685 "rw_mbytes_per_sec": 0, 00:20:13.685 "r_mbytes_per_sec": 0, 00:20:13.685 "w_mbytes_per_sec": 0 00:20:13.685 }, 00:20:13.685 "claimed": false, 00:20:13.685 "zoned": false, 00:20:13.685 "supported_io_types": { 00:20:13.685 "read": true, 00:20:13.685 "write": true, 00:20:13.686 "unmap": true, 00:20:13.686 "flush": true, 00:20:13.686 "reset": true, 00:20:13.686 "nvme_admin": false, 00:20:13.686 "nvme_io": false, 00:20:13.686 "nvme_io_md": false, 00:20:13.686 "write_zeroes": true, 00:20:13.686 "zcopy": false, 00:20:13.686 "get_zone_info": false, 00:20:13.686 "zone_management": false, 00:20:13.686 "zone_append": false, 00:20:13.686 "compare": false, 00:20:13.686 
"compare_and_write": false, 00:20:13.686 "abort": false, 00:20:13.686 "seek_hole": false, 00:20:13.686 "seek_data": false, 00:20:13.686 "copy": false, 00:20:13.686 "nvme_iov_md": false 00:20:13.686 }, 00:20:13.686 "memory_domains": [ 00:20:13.686 { 00:20:13.686 "dma_device_id": "system", 00:20:13.686 "dma_device_type": 1 00:20:13.686 }, 00:20:13.686 { 00:20:13.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.686 "dma_device_type": 2 00:20:13.686 }, 00:20:13.686 { 00:20:13.686 "dma_device_id": "system", 00:20:13.686 "dma_device_type": 1 00:20:13.686 }, 00:20:13.686 { 00:20:13.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.686 "dma_device_type": 2 00:20:13.686 }, 00:20:13.686 { 00:20:13.686 "dma_device_id": "system", 00:20:13.686 "dma_device_type": 1 00:20:13.686 }, 00:20:13.686 { 00:20:13.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.686 "dma_device_type": 2 00:20:13.686 }, 00:20:13.686 { 00:20:13.686 "dma_device_id": "system", 00:20:13.686 "dma_device_type": 1 00:20:13.686 }, 00:20:13.686 { 00:20:13.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.686 "dma_device_type": 2 00:20:13.686 } 00:20:13.686 ], 00:20:13.686 "driver_specific": { 00:20:13.686 "raid": { 00:20:13.686 "uuid": "4ac2c682-c8d9-444b-9266-fddcd3850387", 00:20:13.686 "strip_size_kb": 64, 00:20:13.686 "state": "online", 00:20:13.686 "raid_level": "concat", 00:20:13.686 "superblock": false, 00:20:13.686 "num_base_bdevs": 4, 00:20:13.686 "num_base_bdevs_discovered": 4, 00:20:13.686 "num_base_bdevs_operational": 4, 00:20:13.686 "base_bdevs_list": [ 00:20:13.686 { 00:20:13.686 "name": "BaseBdev1", 00:20:13.686 "uuid": "75ee1d31-6789-4fe9-baee-ab5954fe8908", 00:20:13.686 "is_configured": true, 00:20:13.686 "data_offset": 0, 00:20:13.686 "data_size": 65536 00:20:13.686 }, 00:20:13.686 { 00:20:13.686 "name": "BaseBdev2", 00:20:13.686 "uuid": "c86314a3-238a-4bce-8799-c3e88904d77b", 00:20:13.686 "is_configured": true, 00:20:13.686 "data_offset": 0, 00:20:13.686 "data_size": 65536 00:20:13.686 }, 00:20:13.686 { 00:20:13.686 "name": "BaseBdev3", 00:20:13.686 "uuid": "15f79a78-8f92-4698-a6f5-5b8f6dcd8699", 00:20:13.686 "is_configured": true, 00:20:13.686 "data_offset": 0, 00:20:13.686 "data_size": 65536 00:20:13.686 }, 00:20:13.686 { 00:20:13.686 "name": "BaseBdev4", 00:20:13.686 "uuid": "a9c8b12a-3696-4177-a897-03168baf434f", 00:20:13.686 "is_configured": true, 00:20:13.686 "data_offset": 0, 00:20:13.686 "data_size": 65536 00:20:13.686 } 00:20:13.686 ] 00:20:13.686 } 00:20:13.686 } 00:20:13.686 }' 00:20:13.686 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:13.686 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:13.686 BaseBdev2 00:20:13.686 BaseBdev3 00:20:13.686 BaseBdev4' 00:20:13.686 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:13.686 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:13.686 22:27:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:13.944 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:13.944 "name": "BaseBdev1", 00:20:13.944 "aliases": [ 00:20:13.944 "75ee1d31-6789-4fe9-baee-ab5954fe8908" 00:20:13.944 ], 00:20:13.944 
"product_name": "Malloc disk", 00:20:13.944 "block_size": 512, 00:20:13.944 "num_blocks": 65536, 00:20:13.944 "uuid": "75ee1d31-6789-4fe9-baee-ab5954fe8908", 00:20:13.944 "assigned_rate_limits": { 00:20:13.944 "rw_ios_per_sec": 0, 00:20:13.944 "rw_mbytes_per_sec": 0, 00:20:13.944 "r_mbytes_per_sec": 0, 00:20:13.944 "w_mbytes_per_sec": 0 00:20:13.944 }, 00:20:13.944 "claimed": true, 00:20:13.944 "claim_type": "exclusive_write", 00:20:13.944 "zoned": false, 00:20:13.944 "supported_io_types": { 00:20:13.944 "read": true, 00:20:13.944 "write": true, 00:20:13.944 "unmap": true, 00:20:13.944 "flush": true, 00:20:13.944 "reset": true, 00:20:13.944 "nvme_admin": false, 00:20:13.944 "nvme_io": false, 00:20:13.944 "nvme_io_md": false, 00:20:13.944 "write_zeroes": true, 00:20:13.944 "zcopy": true, 00:20:13.944 "get_zone_info": false, 00:20:13.944 "zone_management": false, 00:20:13.944 "zone_append": false, 00:20:13.944 "compare": false, 00:20:13.944 "compare_and_write": false, 00:20:13.944 "abort": true, 00:20:13.944 "seek_hole": false, 00:20:13.944 "seek_data": false, 00:20:13.944 "copy": true, 00:20:13.944 "nvme_iov_md": false 00:20:13.944 }, 00:20:13.944 "memory_domains": [ 00:20:13.944 { 00:20:13.944 "dma_device_id": "system", 00:20:13.944 "dma_device_type": 1 00:20:13.944 }, 00:20:13.944 { 00:20:13.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.944 "dma_device_type": 2 00:20:13.944 } 00:20:13.944 ], 00:20:13.944 "driver_specific": {} 00:20:13.944 }' 00:20:13.944 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:13.944 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:13.944 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:13.944 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:13.944 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:14.203 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:14.203 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.203 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.203 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:14.203 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:14.203 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:14.203 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:14.203 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:14.203 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:14.203 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:14.462 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:14.462 "name": "BaseBdev2", 00:20:14.462 "aliases": [ 00:20:14.462 "c86314a3-238a-4bce-8799-c3e88904d77b" 00:20:14.462 ], 00:20:14.462 "product_name": "Malloc disk", 00:20:14.462 "block_size": 512, 00:20:14.462 "num_blocks": 65536, 00:20:14.462 "uuid": "c86314a3-238a-4bce-8799-c3e88904d77b", 00:20:14.462 
"assigned_rate_limits": { 00:20:14.462 "rw_ios_per_sec": 0, 00:20:14.462 "rw_mbytes_per_sec": 0, 00:20:14.462 "r_mbytes_per_sec": 0, 00:20:14.462 "w_mbytes_per_sec": 0 00:20:14.462 }, 00:20:14.462 "claimed": true, 00:20:14.462 "claim_type": "exclusive_write", 00:20:14.462 "zoned": false, 00:20:14.462 "supported_io_types": { 00:20:14.462 "read": true, 00:20:14.462 "write": true, 00:20:14.462 "unmap": true, 00:20:14.462 "flush": true, 00:20:14.462 "reset": true, 00:20:14.462 "nvme_admin": false, 00:20:14.462 "nvme_io": false, 00:20:14.462 "nvme_io_md": false, 00:20:14.462 "write_zeroes": true, 00:20:14.462 "zcopy": true, 00:20:14.462 "get_zone_info": false, 00:20:14.462 "zone_management": false, 00:20:14.462 "zone_append": false, 00:20:14.462 "compare": false, 00:20:14.462 "compare_and_write": false, 00:20:14.462 "abort": true, 00:20:14.462 "seek_hole": false, 00:20:14.462 "seek_data": false, 00:20:14.462 "copy": true, 00:20:14.462 "nvme_iov_md": false 00:20:14.462 }, 00:20:14.462 "memory_domains": [ 00:20:14.462 { 00:20:14.462 "dma_device_id": "system", 00:20:14.462 "dma_device_type": 1 00:20:14.462 }, 00:20:14.462 { 00:20:14.462 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.462 "dma_device_type": 2 00:20:14.462 } 00:20:14.462 ], 00:20:14.462 "driver_specific": {} 00:20:14.462 }' 00:20:14.462 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.462 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.721 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:14.721 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:14.721 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:14.721 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:14.721 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.721 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.721 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:14.721 22:27:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:14.721 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:14.721 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:14.721 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:14.721 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:14.721 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:14.979 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:14.979 "name": "BaseBdev3", 00:20:14.979 "aliases": [ 00:20:14.979 "15f79a78-8f92-4698-a6f5-5b8f6dcd8699" 00:20:14.979 ], 00:20:14.979 "product_name": "Malloc disk", 00:20:14.979 "block_size": 512, 00:20:14.979 "num_blocks": 65536, 00:20:14.979 "uuid": "15f79a78-8f92-4698-a6f5-5b8f6dcd8699", 00:20:14.979 "assigned_rate_limits": { 00:20:14.979 "rw_ios_per_sec": 0, 00:20:14.979 "rw_mbytes_per_sec": 0, 00:20:14.979 "r_mbytes_per_sec": 0, 00:20:14.979 "w_mbytes_per_sec": 0 00:20:14.979 
}, 00:20:14.979 "claimed": true, 00:20:14.979 "claim_type": "exclusive_write", 00:20:14.979 "zoned": false, 00:20:14.979 "supported_io_types": { 00:20:14.979 "read": true, 00:20:14.979 "write": true, 00:20:14.979 "unmap": true, 00:20:14.979 "flush": true, 00:20:14.979 "reset": true, 00:20:14.979 "nvme_admin": false, 00:20:14.979 "nvme_io": false, 00:20:14.979 "nvme_io_md": false, 00:20:14.979 "write_zeroes": true, 00:20:14.979 "zcopy": true, 00:20:14.979 "get_zone_info": false, 00:20:14.979 "zone_management": false, 00:20:14.979 "zone_append": false, 00:20:14.979 "compare": false, 00:20:14.979 "compare_and_write": false, 00:20:14.979 "abort": true, 00:20:14.979 "seek_hole": false, 00:20:14.980 "seek_data": false, 00:20:14.980 "copy": true, 00:20:14.980 "nvme_iov_md": false 00:20:14.980 }, 00:20:14.980 "memory_domains": [ 00:20:14.980 { 00:20:14.980 "dma_device_id": "system", 00:20:14.980 "dma_device_type": 1 00:20:14.980 }, 00:20:14.980 { 00:20:14.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.980 "dma_device_type": 2 00:20:14.980 } 00:20:14.980 ], 00:20:14.980 "driver_specific": {} 00:20:14.980 }' 00:20:14.980 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.238 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.238 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:15.238 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.238 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.238 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:15.238 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.238 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.238 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:15.238 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.238 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.497 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:15.497 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:15.497 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:15.497 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:15.755 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:15.755 "name": "BaseBdev4", 00:20:15.755 "aliases": [ 00:20:15.755 "a9c8b12a-3696-4177-a897-03168baf434f" 00:20:15.755 ], 00:20:15.755 "product_name": "Malloc disk", 00:20:15.755 "block_size": 512, 00:20:15.755 "num_blocks": 65536, 00:20:15.755 "uuid": "a9c8b12a-3696-4177-a897-03168baf434f", 00:20:15.755 "assigned_rate_limits": { 00:20:15.755 "rw_ios_per_sec": 0, 00:20:15.755 "rw_mbytes_per_sec": 0, 00:20:15.755 "r_mbytes_per_sec": 0, 00:20:15.755 "w_mbytes_per_sec": 0 00:20:15.755 }, 00:20:15.755 "claimed": true, 00:20:15.755 "claim_type": "exclusive_write", 00:20:15.755 "zoned": false, 00:20:15.755 "supported_io_types": { 00:20:15.755 "read": true, 
00:20:15.755 "write": true, 00:20:15.755 "unmap": true, 00:20:15.755 "flush": true, 00:20:15.755 "reset": true, 00:20:15.755 "nvme_admin": false, 00:20:15.755 "nvme_io": false, 00:20:15.755 "nvme_io_md": false, 00:20:15.755 "write_zeroes": true, 00:20:15.755 "zcopy": true, 00:20:15.755 "get_zone_info": false, 00:20:15.755 "zone_management": false, 00:20:15.755 "zone_append": false, 00:20:15.755 "compare": false, 00:20:15.755 "compare_and_write": false, 00:20:15.755 "abort": true, 00:20:15.755 "seek_hole": false, 00:20:15.755 "seek_data": false, 00:20:15.755 "copy": true, 00:20:15.755 "nvme_iov_md": false 00:20:15.755 }, 00:20:15.755 "memory_domains": [ 00:20:15.755 { 00:20:15.755 "dma_device_id": "system", 00:20:15.755 "dma_device_type": 1 00:20:15.755 }, 00:20:15.755 { 00:20:15.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.755 "dma_device_type": 2 00:20:15.755 } 00:20:15.755 ], 00:20:15.755 "driver_specific": {} 00:20:15.755 }' 00:20:15.755 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.755 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.755 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:15.755 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.755 22:27:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.755 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:15.755 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.755 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.755 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:15.755 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.014 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.014 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:16.014 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:16.273 [2024-07-12 22:27:26.413724] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:16.273 [2024-07-12 22:27:26.413752] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:16.273 [2024-07-12 22:27:26.413801] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:16.273 22:27:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:16.273 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.531 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:16.531 "name": "Existed_Raid", 00:20:16.531 "uuid": "4ac2c682-c8d9-444b-9266-fddcd3850387", 00:20:16.531 "strip_size_kb": 64, 00:20:16.531 "state": "offline", 00:20:16.531 "raid_level": "concat", 00:20:16.531 "superblock": false, 00:20:16.531 "num_base_bdevs": 4, 00:20:16.531 "num_base_bdevs_discovered": 3, 00:20:16.531 "num_base_bdevs_operational": 3, 00:20:16.531 "base_bdevs_list": [ 00:20:16.531 { 00:20:16.531 "name": null, 00:20:16.531 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.531 "is_configured": false, 00:20:16.531 "data_offset": 0, 00:20:16.531 "data_size": 65536 00:20:16.531 }, 00:20:16.531 { 00:20:16.531 "name": "BaseBdev2", 00:20:16.531 "uuid": "c86314a3-238a-4bce-8799-c3e88904d77b", 00:20:16.531 "is_configured": true, 00:20:16.531 "data_offset": 0, 00:20:16.531 "data_size": 65536 00:20:16.531 }, 00:20:16.531 { 00:20:16.531 "name": "BaseBdev3", 00:20:16.531 "uuid": "15f79a78-8f92-4698-a6f5-5b8f6dcd8699", 00:20:16.531 "is_configured": true, 00:20:16.531 "data_offset": 0, 00:20:16.531 "data_size": 65536 00:20:16.531 }, 00:20:16.531 { 00:20:16.531 "name": "BaseBdev4", 00:20:16.531 "uuid": "a9c8b12a-3696-4177-a897-03168baf434f", 00:20:16.531 "is_configured": true, 00:20:16.531 "data_offset": 0, 00:20:16.531 "data_size": 65536 00:20:16.531 } 00:20:16.531 ] 00:20:16.531 }' 00:20:16.531 22:27:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:16.531 22:27:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:17.100 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:17.100 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:17.100 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:17.100 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.359 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:17.359 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:20:17.359 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:17.359 [2024-07-12 22:27:27.622879] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:17.359 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:17.359 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:17.359 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.359 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:17.618 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:17.618 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:17.618 22:27:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:17.877 [2024-07-12 22:27:28.138900] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:17.877 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:17.877 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:17.877 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.877 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:18.136 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:18.136 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:18.136 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:18.394 [2024-07-12 22:27:28.646896] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:18.394 [2024-07-12 22:27:28.646946] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c0350 name Existed_Raid, state offline 00:20:18.394 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:18.394 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:18.394 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.394 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:18.692 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:18.692 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:18.692 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:18.692 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i 
= 1 )) 00:20:18.692 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:18.692 22:27:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:18.961 BaseBdev2 00:20:18.961 22:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:18.961 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:18.961 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:18.961 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:18.961 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:18.961 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:18.961 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:19.220 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:19.479 [ 00:20:19.479 { 00:20:19.479 "name": "BaseBdev2", 00:20:19.479 "aliases": [ 00:20:19.479 "a57aa934-f8fd-4989-acc6-edf1fc35902e" 00:20:19.479 ], 00:20:19.479 "product_name": "Malloc disk", 00:20:19.479 "block_size": 512, 00:20:19.479 "num_blocks": 65536, 00:20:19.479 "uuid": "a57aa934-f8fd-4989-acc6-edf1fc35902e", 00:20:19.479 "assigned_rate_limits": { 00:20:19.479 "rw_ios_per_sec": 0, 00:20:19.479 "rw_mbytes_per_sec": 0, 00:20:19.479 "r_mbytes_per_sec": 0, 00:20:19.479 "w_mbytes_per_sec": 0 00:20:19.479 }, 00:20:19.479 "claimed": false, 00:20:19.479 "zoned": false, 00:20:19.479 "supported_io_types": { 00:20:19.479 "read": true, 00:20:19.479 "write": true, 00:20:19.479 "unmap": true, 00:20:19.479 "flush": true, 00:20:19.479 "reset": true, 00:20:19.479 "nvme_admin": false, 00:20:19.479 "nvme_io": false, 00:20:19.479 "nvme_io_md": false, 00:20:19.479 "write_zeroes": true, 00:20:19.479 "zcopy": true, 00:20:19.479 "get_zone_info": false, 00:20:19.479 "zone_management": false, 00:20:19.479 "zone_append": false, 00:20:19.479 "compare": false, 00:20:19.479 "compare_and_write": false, 00:20:19.479 "abort": true, 00:20:19.479 "seek_hole": false, 00:20:19.479 "seek_data": false, 00:20:19.479 "copy": true, 00:20:19.479 "nvme_iov_md": false 00:20:19.479 }, 00:20:19.479 "memory_domains": [ 00:20:19.479 { 00:20:19.479 "dma_device_id": "system", 00:20:19.479 "dma_device_type": 1 00:20:19.479 }, 00:20:19.479 { 00:20:19.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:19.479 "dma_device_type": 2 00:20:19.479 } 00:20:19.479 ], 00:20:19.479 "driver_specific": {} 00:20:19.479 } 00:20:19.479 ] 00:20:19.479 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:19.479 22:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:19.479 22:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:19.479 22:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:19.739 BaseBdev3 00:20:19.739 22:27:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:19.739 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:19.739 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:19.739 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:19.739 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:19.739 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:19.739 22:27:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:19.997 22:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:20.255 [ 00:20:20.255 { 00:20:20.255 "name": "BaseBdev3", 00:20:20.255 "aliases": [ 00:20:20.255 "c3ea81c3-a5de-47dc-8874-f5e115189401" 00:20:20.255 ], 00:20:20.255 "product_name": "Malloc disk", 00:20:20.255 "block_size": 512, 00:20:20.255 "num_blocks": 65536, 00:20:20.255 "uuid": "c3ea81c3-a5de-47dc-8874-f5e115189401", 00:20:20.255 "assigned_rate_limits": { 00:20:20.255 "rw_ios_per_sec": 0, 00:20:20.255 "rw_mbytes_per_sec": 0, 00:20:20.255 "r_mbytes_per_sec": 0, 00:20:20.255 "w_mbytes_per_sec": 0 00:20:20.255 }, 00:20:20.255 "claimed": false, 00:20:20.255 "zoned": false, 00:20:20.255 "supported_io_types": { 00:20:20.255 "read": true, 00:20:20.255 "write": true, 00:20:20.255 "unmap": true, 00:20:20.255 "flush": true, 00:20:20.255 "reset": true, 00:20:20.255 "nvme_admin": false, 00:20:20.255 "nvme_io": false, 00:20:20.255 "nvme_io_md": false, 00:20:20.255 "write_zeroes": true, 00:20:20.255 "zcopy": true, 00:20:20.255 "get_zone_info": false, 00:20:20.255 "zone_management": false, 00:20:20.255 "zone_append": false, 00:20:20.255 "compare": false, 00:20:20.255 "compare_and_write": false, 00:20:20.255 "abort": true, 00:20:20.255 "seek_hole": false, 00:20:20.255 "seek_data": false, 00:20:20.255 "copy": true, 00:20:20.255 "nvme_iov_md": false 00:20:20.255 }, 00:20:20.255 "memory_domains": [ 00:20:20.255 { 00:20:20.255 "dma_device_id": "system", 00:20:20.255 "dma_device_type": 1 00:20:20.255 }, 00:20:20.255 { 00:20:20.255 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:20.255 "dma_device_type": 2 00:20:20.255 } 00:20:20.255 ], 00:20:20.255 "driver_specific": {} 00:20:20.255 } 00:20:20.255 ] 00:20:20.255 22:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:20.255 22:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:20.255 22:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:20.255 22:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:20.513 BaseBdev4 00:20:20.513 22:27:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:20.513 22:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # 
local bdev_name=BaseBdev4 00:20:20.513 22:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:20.513 22:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:20.513 22:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:20.513 22:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:20.513 22:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:20.771 22:27:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:21.030 [ 00:20:21.030 { 00:20:21.030 "name": "BaseBdev4", 00:20:21.030 "aliases": [ 00:20:21.030 "d2c93983-cdcc-4b76-adc0-4c07860e5ba7" 00:20:21.030 ], 00:20:21.030 "product_name": "Malloc disk", 00:20:21.030 "block_size": 512, 00:20:21.030 "num_blocks": 65536, 00:20:21.030 "uuid": "d2c93983-cdcc-4b76-adc0-4c07860e5ba7", 00:20:21.030 "assigned_rate_limits": { 00:20:21.030 "rw_ios_per_sec": 0, 00:20:21.030 "rw_mbytes_per_sec": 0, 00:20:21.030 "r_mbytes_per_sec": 0, 00:20:21.030 "w_mbytes_per_sec": 0 00:20:21.030 }, 00:20:21.030 "claimed": false, 00:20:21.030 "zoned": false, 00:20:21.030 "supported_io_types": { 00:20:21.030 "read": true, 00:20:21.030 "write": true, 00:20:21.030 "unmap": true, 00:20:21.030 "flush": true, 00:20:21.030 "reset": true, 00:20:21.030 "nvme_admin": false, 00:20:21.030 "nvme_io": false, 00:20:21.030 "nvme_io_md": false, 00:20:21.030 "write_zeroes": true, 00:20:21.030 "zcopy": true, 00:20:21.030 "get_zone_info": false, 00:20:21.030 "zone_management": false, 00:20:21.030 "zone_append": false, 00:20:21.030 "compare": false, 00:20:21.030 "compare_and_write": false, 00:20:21.030 "abort": true, 00:20:21.030 "seek_hole": false, 00:20:21.030 "seek_data": false, 00:20:21.030 "copy": true, 00:20:21.030 "nvme_iov_md": false 00:20:21.030 }, 00:20:21.030 "memory_domains": [ 00:20:21.030 { 00:20:21.030 "dma_device_id": "system", 00:20:21.030 "dma_device_type": 1 00:20:21.030 }, 00:20:21.030 { 00:20:21.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.030 "dma_device_type": 2 00:20:21.030 } 00:20:21.030 ], 00:20:21.030 "driver_specific": {} 00:20:21.030 } 00:20:21.030 ] 00:20:21.030 22:27:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:21.030 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:21.030 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:21.030 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:21.288 [2024-07-12 22:27:31.401170] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:21.288 [2024-07-12 22:27:31.401208] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:21.288 [2024-07-12 22:27:31.401227] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:21.288 [2024-07-12 22:27:31.402537] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:21.288 [2024-07-12 22:27:31.402578] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:21.288 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:21.288 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:21.288 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:21.288 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:21.288 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:21.288 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:21.288 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:21.288 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.288 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.288 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.288 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.288 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:21.853 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:21.853 "name": "Existed_Raid", 00:20:21.853 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:21.853 "strip_size_kb": 64, 00:20:21.853 "state": "configuring", 00:20:21.853 "raid_level": "concat", 00:20:21.853 "superblock": false, 00:20:21.853 "num_base_bdevs": 4, 00:20:21.853 "num_base_bdevs_discovered": 3, 00:20:21.853 "num_base_bdevs_operational": 4, 00:20:21.853 "base_bdevs_list": [ 00:20:21.853 { 00:20:21.853 "name": "BaseBdev1", 00:20:21.853 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:21.853 "is_configured": false, 00:20:21.853 "data_offset": 0, 00:20:21.853 "data_size": 0 00:20:21.853 }, 00:20:21.853 { 00:20:21.853 "name": "BaseBdev2", 00:20:21.853 "uuid": "a57aa934-f8fd-4989-acc6-edf1fc35902e", 00:20:21.853 "is_configured": true, 00:20:21.853 "data_offset": 0, 00:20:21.853 "data_size": 65536 00:20:21.853 }, 00:20:21.853 { 00:20:21.853 "name": "BaseBdev3", 00:20:21.853 "uuid": "c3ea81c3-a5de-47dc-8874-f5e115189401", 00:20:21.853 "is_configured": true, 00:20:21.853 "data_offset": 0, 00:20:21.853 "data_size": 65536 00:20:21.853 }, 00:20:21.853 { 00:20:21.853 "name": "BaseBdev4", 00:20:21.853 "uuid": "d2c93983-cdcc-4b76-adc0-4c07860e5ba7", 00:20:21.853 "is_configured": true, 00:20:21.853 "data_offset": 0, 00:20:21.853 "data_size": 65536 00:20:21.853 } 00:20:21.853 ] 00:20:21.853 }' 00:20:21.853 22:27:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:21.853 22:27:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:22.417 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 
00:20:22.680 [2024-07-12 22:27:32.764764] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.680 "name": "Existed_Raid", 00:20:22.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.680 "strip_size_kb": 64, 00:20:22.680 "state": "configuring", 00:20:22.680 "raid_level": "concat", 00:20:22.680 "superblock": false, 00:20:22.680 "num_base_bdevs": 4, 00:20:22.680 "num_base_bdevs_discovered": 2, 00:20:22.680 "num_base_bdevs_operational": 4, 00:20:22.680 "base_bdevs_list": [ 00:20:22.680 { 00:20:22.680 "name": "BaseBdev1", 00:20:22.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.680 "is_configured": false, 00:20:22.680 "data_offset": 0, 00:20:22.680 "data_size": 0 00:20:22.680 }, 00:20:22.680 { 00:20:22.680 "name": null, 00:20:22.680 "uuid": "a57aa934-f8fd-4989-acc6-edf1fc35902e", 00:20:22.680 "is_configured": false, 00:20:22.680 "data_offset": 0, 00:20:22.680 "data_size": 65536 00:20:22.680 }, 00:20:22.680 { 00:20:22.680 "name": "BaseBdev3", 00:20:22.680 "uuid": "c3ea81c3-a5de-47dc-8874-f5e115189401", 00:20:22.680 "is_configured": true, 00:20:22.680 "data_offset": 0, 00:20:22.680 "data_size": 65536 00:20:22.680 }, 00:20:22.680 { 00:20:22.680 "name": "BaseBdev4", 00:20:22.680 "uuid": "d2c93983-cdcc-4b76-adc0-4c07860e5ba7", 00:20:22.680 "is_configured": true, 00:20:22.680 "data_offset": 0, 00:20:22.680 "data_size": 65536 00:20:22.680 } 00:20:22.680 ] 00:20:22.680 }' 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.680 22:27:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:23.248 22:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.248 22:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:20:23.504 22:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:23.504 22:27:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:23.762 [2024-07-12 22:27:34.032759] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:23.762 BaseBdev1 00:20:23.762 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:23.762 22:27:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:23.762 22:27:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:23.762 22:27:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:23.762 22:27:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:23.762 22:27:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:23.762 22:27:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:24.020 22:27:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:24.277 [ 00:20:24.277 { 00:20:24.277 "name": "BaseBdev1", 00:20:24.277 "aliases": [ 00:20:24.277 "f8254087-2dae-40bb-a0bd-3ba71b6b8994" 00:20:24.277 ], 00:20:24.277 "product_name": "Malloc disk", 00:20:24.277 "block_size": 512, 00:20:24.277 "num_blocks": 65536, 00:20:24.277 "uuid": "f8254087-2dae-40bb-a0bd-3ba71b6b8994", 00:20:24.277 "assigned_rate_limits": { 00:20:24.277 "rw_ios_per_sec": 0, 00:20:24.277 "rw_mbytes_per_sec": 0, 00:20:24.277 "r_mbytes_per_sec": 0, 00:20:24.277 "w_mbytes_per_sec": 0 00:20:24.277 }, 00:20:24.277 "claimed": true, 00:20:24.277 "claim_type": "exclusive_write", 00:20:24.277 "zoned": false, 00:20:24.277 "supported_io_types": { 00:20:24.277 "read": true, 00:20:24.277 "write": true, 00:20:24.277 "unmap": true, 00:20:24.277 "flush": true, 00:20:24.277 "reset": true, 00:20:24.277 "nvme_admin": false, 00:20:24.277 "nvme_io": false, 00:20:24.277 "nvme_io_md": false, 00:20:24.277 "write_zeroes": true, 00:20:24.277 "zcopy": true, 00:20:24.277 "get_zone_info": false, 00:20:24.278 "zone_management": false, 00:20:24.278 "zone_append": false, 00:20:24.278 "compare": false, 00:20:24.278 "compare_and_write": false, 00:20:24.278 "abort": true, 00:20:24.278 "seek_hole": false, 00:20:24.278 "seek_data": false, 00:20:24.278 "copy": true, 00:20:24.278 "nvme_iov_md": false 00:20:24.278 }, 00:20:24.278 "memory_domains": [ 00:20:24.278 { 00:20:24.278 "dma_device_id": "system", 00:20:24.278 "dma_device_type": 1 00:20:24.278 }, 00:20:24.278 { 00:20:24.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.278 "dma_device_type": 2 00:20:24.278 } 00:20:24.278 ], 00:20:24.278 "driver_specific": {} 00:20:24.278 } 00:20:24.278 ] 00:20:24.278 22:27:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:24.278 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:24.278 22:27:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:24.278 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:24.278 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:24.278 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:24.278 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:24.278 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:24.278 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:24.278 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:24.278 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:24.278 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.278 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:24.536 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:24.536 "name": "Existed_Raid", 00:20:24.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:24.536 "strip_size_kb": 64, 00:20:24.536 "state": "configuring", 00:20:24.536 "raid_level": "concat", 00:20:24.536 "superblock": false, 00:20:24.536 "num_base_bdevs": 4, 00:20:24.536 "num_base_bdevs_discovered": 3, 00:20:24.536 "num_base_bdevs_operational": 4, 00:20:24.536 "base_bdevs_list": [ 00:20:24.536 { 00:20:24.536 "name": "BaseBdev1", 00:20:24.536 "uuid": "f8254087-2dae-40bb-a0bd-3ba71b6b8994", 00:20:24.536 "is_configured": true, 00:20:24.536 "data_offset": 0, 00:20:24.536 "data_size": 65536 00:20:24.536 }, 00:20:24.536 { 00:20:24.536 "name": null, 00:20:24.536 "uuid": "a57aa934-f8fd-4989-acc6-edf1fc35902e", 00:20:24.536 "is_configured": false, 00:20:24.536 "data_offset": 0, 00:20:24.536 "data_size": 65536 00:20:24.536 }, 00:20:24.536 { 00:20:24.536 "name": "BaseBdev3", 00:20:24.536 "uuid": "c3ea81c3-a5de-47dc-8874-f5e115189401", 00:20:24.536 "is_configured": true, 00:20:24.536 "data_offset": 0, 00:20:24.536 "data_size": 65536 00:20:24.536 }, 00:20:24.536 { 00:20:24.536 "name": "BaseBdev4", 00:20:24.536 "uuid": "d2c93983-cdcc-4b76-adc0-4c07860e5ba7", 00:20:24.536 "is_configured": true, 00:20:24.536 "data_offset": 0, 00:20:24.536 "data_size": 65536 00:20:24.536 } 00:20:24.536 ] 00:20:24.536 }' 00:20:24.536 22:27:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:24.536 22:27:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:25.101 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.101 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:25.359 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:25.360 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:25.618 [2024-07-12 22:27:35.817526] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:25.618 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:25.618 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:25.618 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:25.618 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:25.618 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:25.618 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:25.618 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.618 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.618 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.618 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.618 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.618 22:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:25.876 22:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.876 "name": "Existed_Raid", 00:20:25.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.876 "strip_size_kb": 64, 00:20:25.876 "state": "configuring", 00:20:25.876 "raid_level": "concat", 00:20:25.876 "superblock": false, 00:20:25.876 "num_base_bdevs": 4, 00:20:25.876 "num_base_bdevs_discovered": 2, 00:20:25.876 "num_base_bdevs_operational": 4, 00:20:25.876 "base_bdevs_list": [ 00:20:25.876 { 00:20:25.876 "name": "BaseBdev1", 00:20:25.876 "uuid": "f8254087-2dae-40bb-a0bd-3ba71b6b8994", 00:20:25.876 "is_configured": true, 00:20:25.876 "data_offset": 0, 00:20:25.876 "data_size": 65536 00:20:25.876 }, 00:20:25.876 { 00:20:25.876 "name": null, 00:20:25.876 "uuid": "a57aa934-f8fd-4989-acc6-edf1fc35902e", 00:20:25.876 "is_configured": false, 00:20:25.876 "data_offset": 0, 00:20:25.876 "data_size": 65536 00:20:25.876 }, 00:20:25.876 { 00:20:25.876 "name": null, 00:20:25.876 "uuid": "c3ea81c3-a5de-47dc-8874-f5e115189401", 00:20:25.876 "is_configured": false, 00:20:25.876 "data_offset": 0, 00:20:25.876 "data_size": 65536 00:20:25.876 }, 00:20:25.876 { 00:20:25.876 "name": "BaseBdev4", 00:20:25.876 "uuid": "d2c93983-cdcc-4b76-adc0-4c07860e5ba7", 00:20:25.876 "is_configured": true, 00:20:25.876 "data_offset": 0, 00:20:25.876 "data_size": 65536 00:20:25.876 } 00:20:25.876 ] 00:20:25.876 }' 00:20:25.876 22:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.876 22:27:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:26.441 22:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.441 22:27:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:26.698 22:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:26.698 22:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:26.698 [2024-07-12 22:27:36.996671] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:26.956 "name": "Existed_Raid", 00:20:26.956 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.956 "strip_size_kb": 64, 00:20:26.956 "state": "configuring", 00:20:26.956 "raid_level": "concat", 00:20:26.956 "superblock": false, 00:20:26.956 "num_base_bdevs": 4, 00:20:26.956 "num_base_bdevs_discovered": 3, 00:20:26.956 "num_base_bdevs_operational": 4, 00:20:26.956 "base_bdevs_list": [ 00:20:26.956 { 00:20:26.956 "name": "BaseBdev1", 00:20:26.956 "uuid": "f8254087-2dae-40bb-a0bd-3ba71b6b8994", 00:20:26.956 "is_configured": true, 00:20:26.956 "data_offset": 0, 00:20:26.956 "data_size": 65536 00:20:26.956 }, 00:20:26.956 { 00:20:26.956 "name": null, 00:20:26.956 "uuid": "a57aa934-f8fd-4989-acc6-edf1fc35902e", 00:20:26.956 "is_configured": false, 00:20:26.956 "data_offset": 0, 00:20:26.956 "data_size": 65536 00:20:26.956 }, 00:20:26.956 { 00:20:26.956 "name": "BaseBdev3", 00:20:26.956 "uuid": "c3ea81c3-a5de-47dc-8874-f5e115189401", 00:20:26.956 "is_configured": true, 00:20:26.956 "data_offset": 0, 00:20:26.956 "data_size": 65536 00:20:26.956 }, 00:20:26.956 { 00:20:26.956 "name": "BaseBdev4", 00:20:26.956 "uuid": "d2c93983-cdcc-4b76-adc0-4c07860e5ba7", 00:20:26.956 "is_configured": true, 00:20:26.956 "data_offset": 0, 00:20:26.956 "data_size": 65536 00:20:26.956 } 00:20:26.956 ] 00:20:26.956 }' 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:20:26.956 22:27:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:27.891 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.891 22:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:27.892 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:27.892 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:28.151 [2024-07-12 22:27:38.276075] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:28.151 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:28.151 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:28.151 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:28.151 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:28.151 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:28.151 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:28.151 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.151 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.151 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.151 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.151 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.151 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:28.410 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:28.410 "name": "Existed_Raid", 00:20:28.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.410 "strip_size_kb": 64, 00:20:28.410 "state": "configuring", 00:20:28.410 "raid_level": "concat", 00:20:28.410 "superblock": false, 00:20:28.410 "num_base_bdevs": 4, 00:20:28.410 "num_base_bdevs_discovered": 2, 00:20:28.410 "num_base_bdevs_operational": 4, 00:20:28.410 "base_bdevs_list": [ 00:20:28.410 { 00:20:28.410 "name": null, 00:20:28.410 "uuid": "f8254087-2dae-40bb-a0bd-3ba71b6b8994", 00:20:28.410 "is_configured": false, 00:20:28.410 "data_offset": 0, 00:20:28.410 "data_size": 65536 00:20:28.410 }, 00:20:28.410 { 00:20:28.410 "name": null, 00:20:28.410 "uuid": "a57aa934-f8fd-4989-acc6-edf1fc35902e", 00:20:28.410 "is_configured": false, 00:20:28.410 "data_offset": 0, 00:20:28.410 "data_size": 65536 00:20:28.410 }, 00:20:28.410 { 00:20:28.410 "name": "BaseBdev3", 00:20:28.410 "uuid": "c3ea81c3-a5de-47dc-8874-f5e115189401", 00:20:28.410 "is_configured": true, 00:20:28.410 "data_offset": 0, 00:20:28.410 "data_size": 65536 00:20:28.410 }, 00:20:28.410 { 
00:20:28.410 "name": "BaseBdev4", 00:20:28.410 "uuid": "d2c93983-cdcc-4b76-adc0-4c07860e5ba7", 00:20:28.410 "is_configured": true, 00:20:28.410 "data_offset": 0, 00:20:28.410 "data_size": 65536 00:20:28.410 } 00:20:28.410 ] 00:20:28.410 }' 00:20:28.410 22:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:28.410 22:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:28.977 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.977 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:29.235 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:29.235 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:29.493 [2024-07-12 22:27:39.630145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:29.493 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:29.493 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:29.493 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:29.493 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:29.493 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:29.493 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:29.493 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:29.493 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:29.493 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:29.493 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:29.493 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.493 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:29.752 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.752 "name": "Existed_Raid", 00:20:29.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.752 "strip_size_kb": 64, 00:20:29.752 "state": "configuring", 00:20:29.752 "raid_level": "concat", 00:20:29.752 "superblock": false, 00:20:29.752 "num_base_bdevs": 4, 00:20:29.752 "num_base_bdevs_discovered": 3, 00:20:29.752 "num_base_bdevs_operational": 4, 00:20:29.752 "base_bdevs_list": [ 00:20:29.752 { 00:20:29.752 "name": null, 00:20:29.752 "uuid": "f8254087-2dae-40bb-a0bd-3ba71b6b8994", 00:20:29.752 "is_configured": false, 00:20:29.752 "data_offset": 0, 00:20:29.752 "data_size": 65536 00:20:29.752 }, 00:20:29.752 { 00:20:29.752 "name": "BaseBdev2", 00:20:29.752 "uuid": 
"a57aa934-f8fd-4989-acc6-edf1fc35902e", 00:20:29.752 "is_configured": true, 00:20:29.752 "data_offset": 0, 00:20:29.752 "data_size": 65536 00:20:29.752 }, 00:20:29.752 { 00:20:29.752 "name": "BaseBdev3", 00:20:29.752 "uuid": "c3ea81c3-a5de-47dc-8874-f5e115189401", 00:20:29.752 "is_configured": true, 00:20:29.752 "data_offset": 0, 00:20:29.752 "data_size": 65536 00:20:29.752 }, 00:20:29.752 { 00:20:29.752 "name": "BaseBdev4", 00:20:29.752 "uuid": "d2c93983-cdcc-4b76-adc0-4c07860e5ba7", 00:20:29.752 "is_configured": true, 00:20:29.752 "data_offset": 0, 00:20:29.752 "data_size": 65536 00:20:29.752 } 00:20:29.752 ] 00:20:29.752 }' 00:20:29.752 22:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.752 22:27:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:30.319 22:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.319 22:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:30.578 22:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:30.578 22:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.578 22:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:30.836 22:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f8254087-2dae-40bb-a0bd-3ba71b6b8994 00:20:31.095 [2024-07-12 22:27:41.222888] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:31.095 [2024-07-12 22:27:41.222940] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16c4040 00:20:31.095 [2024-07-12 22:27:41.222949] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:31.095 [2024-07-12 22:27:41.223147] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16bfa70 00:20:31.095 [2024-07-12 22:27:41.223264] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16c4040 00:20:31.095 [2024-07-12 22:27:41.223276] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16c4040 00:20:31.095 [2024-07-12 22:27:41.223439] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:31.095 NewBaseBdev 00:20:31.095 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:31.095 22:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:31.095 22:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:31.095 22:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:31.095 22:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:31.095 22:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:31.095 22:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:31.095 22:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:31.353 [ 00:20:31.353 { 00:20:31.353 "name": "NewBaseBdev", 00:20:31.353 "aliases": [ 00:20:31.353 "f8254087-2dae-40bb-a0bd-3ba71b6b8994" 00:20:31.353 ], 00:20:31.353 "product_name": "Malloc disk", 00:20:31.353 "block_size": 512, 00:20:31.353 "num_blocks": 65536, 00:20:31.353 "uuid": "f8254087-2dae-40bb-a0bd-3ba71b6b8994", 00:20:31.353 "assigned_rate_limits": { 00:20:31.353 "rw_ios_per_sec": 0, 00:20:31.353 "rw_mbytes_per_sec": 0, 00:20:31.353 "r_mbytes_per_sec": 0, 00:20:31.353 "w_mbytes_per_sec": 0 00:20:31.353 }, 00:20:31.353 "claimed": true, 00:20:31.353 "claim_type": "exclusive_write", 00:20:31.353 "zoned": false, 00:20:31.353 "supported_io_types": { 00:20:31.353 "read": true, 00:20:31.353 "write": true, 00:20:31.353 "unmap": true, 00:20:31.353 "flush": true, 00:20:31.353 "reset": true, 00:20:31.353 "nvme_admin": false, 00:20:31.353 "nvme_io": false, 00:20:31.353 "nvme_io_md": false, 00:20:31.353 "write_zeroes": true, 00:20:31.353 "zcopy": true, 00:20:31.353 "get_zone_info": false, 00:20:31.353 "zone_management": false, 00:20:31.353 "zone_append": false, 00:20:31.353 "compare": false, 00:20:31.353 "compare_and_write": false, 00:20:31.353 "abort": true, 00:20:31.353 "seek_hole": false, 00:20:31.353 "seek_data": false, 00:20:31.354 "copy": true, 00:20:31.354 "nvme_iov_md": false 00:20:31.354 }, 00:20:31.354 "memory_domains": [ 00:20:31.354 { 00:20:31.354 "dma_device_id": "system", 00:20:31.354 "dma_device_type": 1 00:20:31.354 }, 00:20:31.354 { 00:20:31.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.354 "dma_device_type": 2 00:20:31.354 } 00:20:31.354 ], 00:20:31.354 "driver_specific": {} 00:20:31.354 } 00:20:31.354 ] 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.354 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:20:31.612 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.612 "name": "Existed_Raid", 00:20:31.612 "uuid": "1d1a3d03-5a32-48bb-adb6-e161e132926e", 00:20:31.612 "strip_size_kb": 64, 00:20:31.612 "state": "online", 00:20:31.612 "raid_level": "concat", 00:20:31.612 "superblock": false, 00:20:31.612 "num_base_bdevs": 4, 00:20:31.612 "num_base_bdevs_discovered": 4, 00:20:31.612 "num_base_bdevs_operational": 4, 00:20:31.612 "base_bdevs_list": [ 00:20:31.612 { 00:20:31.612 "name": "NewBaseBdev", 00:20:31.612 "uuid": "f8254087-2dae-40bb-a0bd-3ba71b6b8994", 00:20:31.612 "is_configured": true, 00:20:31.612 "data_offset": 0, 00:20:31.612 "data_size": 65536 00:20:31.612 }, 00:20:31.612 { 00:20:31.612 "name": "BaseBdev2", 00:20:31.612 "uuid": "a57aa934-f8fd-4989-acc6-edf1fc35902e", 00:20:31.612 "is_configured": true, 00:20:31.612 "data_offset": 0, 00:20:31.612 "data_size": 65536 00:20:31.612 }, 00:20:31.612 { 00:20:31.612 "name": "BaseBdev3", 00:20:31.612 "uuid": "c3ea81c3-a5de-47dc-8874-f5e115189401", 00:20:31.612 "is_configured": true, 00:20:31.612 "data_offset": 0, 00:20:31.612 "data_size": 65536 00:20:31.612 }, 00:20:31.612 { 00:20:31.612 "name": "BaseBdev4", 00:20:31.612 "uuid": "d2c93983-cdcc-4b76-adc0-4c07860e5ba7", 00:20:31.612 "is_configured": true, 00:20:31.612 "data_offset": 0, 00:20:31.612 "data_size": 65536 00:20:31.612 } 00:20:31.612 ] 00:20:31.612 }' 00:20:31.612 22:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.612 22:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:32.179 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:32.179 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:32.179 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:32.179 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:32.179 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:32.179 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:32.179 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:32.179 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:32.437 [2024-07-12 22:27:42.679090] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:32.437 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:32.437 "name": "Existed_Raid", 00:20:32.437 "aliases": [ 00:20:32.437 "1d1a3d03-5a32-48bb-adb6-e161e132926e" 00:20:32.437 ], 00:20:32.437 "product_name": "Raid Volume", 00:20:32.437 "block_size": 512, 00:20:32.437 "num_blocks": 262144, 00:20:32.437 "uuid": "1d1a3d03-5a32-48bb-adb6-e161e132926e", 00:20:32.437 "assigned_rate_limits": { 00:20:32.437 "rw_ios_per_sec": 0, 00:20:32.437 "rw_mbytes_per_sec": 0, 00:20:32.437 "r_mbytes_per_sec": 0, 00:20:32.437 "w_mbytes_per_sec": 0 00:20:32.437 }, 00:20:32.437 "claimed": false, 00:20:32.437 "zoned": false, 00:20:32.437 "supported_io_types": { 00:20:32.437 "read": true, 00:20:32.437 "write": true, 00:20:32.437 "unmap": true, 
00:20:32.437 "flush": true, 00:20:32.437 "reset": true, 00:20:32.437 "nvme_admin": false, 00:20:32.437 "nvme_io": false, 00:20:32.437 "nvme_io_md": false, 00:20:32.437 "write_zeroes": true, 00:20:32.437 "zcopy": false, 00:20:32.437 "get_zone_info": false, 00:20:32.437 "zone_management": false, 00:20:32.437 "zone_append": false, 00:20:32.437 "compare": false, 00:20:32.437 "compare_and_write": false, 00:20:32.437 "abort": false, 00:20:32.437 "seek_hole": false, 00:20:32.437 "seek_data": false, 00:20:32.437 "copy": false, 00:20:32.437 "nvme_iov_md": false 00:20:32.437 }, 00:20:32.437 "memory_domains": [ 00:20:32.437 { 00:20:32.437 "dma_device_id": "system", 00:20:32.437 "dma_device_type": 1 00:20:32.437 }, 00:20:32.437 { 00:20:32.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.437 "dma_device_type": 2 00:20:32.437 }, 00:20:32.437 { 00:20:32.437 "dma_device_id": "system", 00:20:32.437 "dma_device_type": 1 00:20:32.437 }, 00:20:32.437 { 00:20:32.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.437 "dma_device_type": 2 00:20:32.437 }, 00:20:32.437 { 00:20:32.437 "dma_device_id": "system", 00:20:32.437 "dma_device_type": 1 00:20:32.437 }, 00:20:32.437 { 00:20:32.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.437 "dma_device_type": 2 00:20:32.437 }, 00:20:32.437 { 00:20:32.437 "dma_device_id": "system", 00:20:32.437 "dma_device_type": 1 00:20:32.437 }, 00:20:32.437 { 00:20:32.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.437 "dma_device_type": 2 00:20:32.437 } 00:20:32.437 ], 00:20:32.437 "driver_specific": { 00:20:32.437 "raid": { 00:20:32.437 "uuid": "1d1a3d03-5a32-48bb-adb6-e161e132926e", 00:20:32.437 "strip_size_kb": 64, 00:20:32.437 "state": "online", 00:20:32.437 "raid_level": "concat", 00:20:32.437 "superblock": false, 00:20:32.437 "num_base_bdevs": 4, 00:20:32.437 "num_base_bdevs_discovered": 4, 00:20:32.437 "num_base_bdevs_operational": 4, 00:20:32.437 "base_bdevs_list": [ 00:20:32.437 { 00:20:32.437 "name": "NewBaseBdev", 00:20:32.437 "uuid": "f8254087-2dae-40bb-a0bd-3ba71b6b8994", 00:20:32.437 "is_configured": true, 00:20:32.437 "data_offset": 0, 00:20:32.437 "data_size": 65536 00:20:32.437 }, 00:20:32.437 { 00:20:32.437 "name": "BaseBdev2", 00:20:32.437 "uuid": "a57aa934-f8fd-4989-acc6-edf1fc35902e", 00:20:32.437 "is_configured": true, 00:20:32.437 "data_offset": 0, 00:20:32.437 "data_size": 65536 00:20:32.437 }, 00:20:32.437 { 00:20:32.437 "name": "BaseBdev3", 00:20:32.437 "uuid": "c3ea81c3-a5de-47dc-8874-f5e115189401", 00:20:32.437 "is_configured": true, 00:20:32.437 "data_offset": 0, 00:20:32.437 "data_size": 65536 00:20:32.437 }, 00:20:32.437 { 00:20:32.437 "name": "BaseBdev4", 00:20:32.437 "uuid": "d2c93983-cdcc-4b76-adc0-4c07860e5ba7", 00:20:32.437 "is_configured": true, 00:20:32.437 "data_offset": 0, 00:20:32.437 "data_size": 65536 00:20:32.437 } 00:20:32.437 ] 00:20:32.437 } 00:20:32.437 } 00:20:32.437 }' 00:20:32.437 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:32.705 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:32.705 BaseBdev2 00:20:32.705 BaseBdev3 00:20:32.705 BaseBdev4' 00:20:32.705 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:32.705 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b NewBaseBdev 00:20:32.705 22:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:32.705 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:32.705 "name": "NewBaseBdev", 00:20:32.705 "aliases": [ 00:20:32.705 "f8254087-2dae-40bb-a0bd-3ba71b6b8994" 00:20:32.705 ], 00:20:32.705 "product_name": "Malloc disk", 00:20:32.705 "block_size": 512, 00:20:32.705 "num_blocks": 65536, 00:20:32.705 "uuid": "f8254087-2dae-40bb-a0bd-3ba71b6b8994", 00:20:32.705 "assigned_rate_limits": { 00:20:32.705 "rw_ios_per_sec": 0, 00:20:32.705 "rw_mbytes_per_sec": 0, 00:20:32.705 "r_mbytes_per_sec": 0, 00:20:32.705 "w_mbytes_per_sec": 0 00:20:32.705 }, 00:20:32.705 "claimed": true, 00:20:32.705 "claim_type": "exclusive_write", 00:20:32.705 "zoned": false, 00:20:32.705 "supported_io_types": { 00:20:32.705 "read": true, 00:20:32.705 "write": true, 00:20:32.705 "unmap": true, 00:20:32.705 "flush": true, 00:20:32.705 "reset": true, 00:20:32.705 "nvme_admin": false, 00:20:32.705 "nvme_io": false, 00:20:32.705 "nvme_io_md": false, 00:20:32.705 "write_zeroes": true, 00:20:32.705 "zcopy": true, 00:20:32.705 "get_zone_info": false, 00:20:32.705 "zone_management": false, 00:20:32.705 "zone_append": false, 00:20:32.705 "compare": false, 00:20:32.705 "compare_and_write": false, 00:20:32.705 "abort": true, 00:20:32.705 "seek_hole": false, 00:20:32.705 "seek_data": false, 00:20:32.705 "copy": true, 00:20:32.705 "nvme_iov_md": false 00:20:32.705 }, 00:20:32.705 "memory_domains": [ 00:20:32.705 { 00:20:32.705 "dma_device_id": "system", 00:20:32.705 "dma_device_type": 1 00:20:32.705 }, 00:20:32.705 { 00:20:32.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.705 "dma_device_type": 2 00:20:32.705 } 00:20:32.705 ], 00:20:32.705 "driver_specific": {} 00:20:32.705 }' 00:20:32.705 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.962 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.962 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:32.962 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.962 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.962 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:32.962 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.962 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.962 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:32.962 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:33.220 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:33.220 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:33.220 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:33.220 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:33.220 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:33.478 22:27:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:33.478 "name": "BaseBdev2", 00:20:33.478 "aliases": [ 00:20:33.478 "a57aa934-f8fd-4989-acc6-edf1fc35902e" 00:20:33.478 ], 00:20:33.478 "product_name": "Malloc disk", 00:20:33.478 "block_size": 512, 00:20:33.478 "num_blocks": 65536, 00:20:33.478 "uuid": "a57aa934-f8fd-4989-acc6-edf1fc35902e", 00:20:33.478 "assigned_rate_limits": { 00:20:33.478 "rw_ios_per_sec": 0, 00:20:33.478 "rw_mbytes_per_sec": 0, 00:20:33.478 "r_mbytes_per_sec": 0, 00:20:33.478 "w_mbytes_per_sec": 0 00:20:33.478 }, 00:20:33.478 "claimed": true, 00:20:33.478 "claim_type": "exclusive_write", 00:20:33.478 "zoned": false, 00:20:33.478 "supported_io_types": { 00:20:33.478 "read": true, 00:20:33.478 "write": true, 00:20:33.478 "unmap": true, 00:20:33.478 "flush": true, 00:20:33.478 "reset": true, 00:20:33.478 "nvme_admin": false, 00:20:33.478 "nvme_io": false, 00:20:33.478 "nvme_io_md": false, 00:20:33.478 "write_zeroes": true, 00:20:33.478 "zcopy": true, 00:20:33.478 "get_zone_info": false, 00:20:33.478 "zone_management": false, 00:20:33.478 "zone_append": false, 00:20:33.478 "compare": false, 00:20:33.478 "compare_and_write": false, 00:20:33.478 "abort": true, 00:20:33.478 "seek_hole": false, 00:20:33.478 "seek_data": false, 00:20:33.478 "copy": true, 00:20:33.478 "nvme_iov_md": false 00:20:33.478 }, 00:20:33.478 "memory_domains": [ 00:20:33.478 { 00:20:33.478 "dma_device_id": "system", 00:20:33.479 "dma_device_type": 1 00:20:33.479 }, 00:20:33.479 { 00:20:33.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:33.479 "dma_device_type": 2 00:20:33.479 } 00:20:33.479 ], 00:20:33.479 "driver_specific": {} 00:20:33.479 }' 00:20:33.479 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:33.479 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:33.479 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:33.479 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:33.479 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:33.479 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:33.479 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:33.736 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:33.736 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:33.736 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:33.736 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:33.736 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:33.736 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:33.736 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:33.736 22:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:34.301 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:34.301 "name": "BaseBdev3", 00:20:34.301 "aliases": [ 00:20:34.301 
"c3ea81c3-a5de-47dc-8874-f5e115189401" 00:20:34.301 ], 00:20:34.301 "product_name": "Malloc disk", 00:20:34.301 "block_size": 512, 00:20:34.301 "num_blocks": 65536, 00:20:34.301 "uuid": "c3ea81c3-a5de-47dc-8874-f5e115189401", 00:20:34.301 "assigned_rate_limits": { 00:20:34.301 "rw_ios_per_sec": 0, 00:20:34.301 "rw_mbytes_per_sec": 0, 00:20:34.301 "r_mbytes_per_sec": 0, 00:20:34.301 "w_mbytes_per_sec": 0 00:20:34.301 }, 00:20:34.301 "claimed": true, 00:20:34.301 "claim_type": "exclusive_write", 00:20:34.301 "zoned": false, 00:20:34.301 "supported_io_types": { 00:20:34.301 "read": true, 00:20:34.301 "write": true, 00:20:34.301 "unmap": true, 00:20:34.301 "flush": true, 00:20:34.301 "reset": true, 00:20:34.301 "nvme_admin": false, 00:20:34.301 "nvme_io": false, 00:20:34.301 "nvme_io_md": false, 00:20:34.301 "write_zeroes": true, 00:20:34.301 "zcopy": true, 00:20:34.301 "get_zone_info": false, 00:20:34.301 "zone_management": false, 00:20:34.301 "zone_append": false, 00:20:34.301 "compare": false, 00:20:34.301 "compare_and_write": false, 00:20:34.301 "abort": true, 00:20:34.301 "seek_hole": false, 00:20:34.301 "seek_data": false, 00:20:34.301 "copy": true, 00:20:34.301 "nvme_iov_md": false 00:20:34.301 }, 00:20:34.301 "memory_domains": [ 00:20:34.301 { 00:20:34.301 "dma_device_id": "system", 00:20:34.301 "dma_device_type": 1 00:20:34.301 }, 00:20:34.301 { 00:20:34.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.301 "dma_device_type": 2 00:20:34.301 } 00:20:34.301 ], 00:20:34.301 "driver_specific": {} 00:20:34.301 }' 00:20:34.301 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.301 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.301 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:34.301 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:34.301 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:34.301 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:34.301 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:34.558 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:34.558 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:34.558 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:34.558 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:34.558 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:34.558 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:34.558 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:34.558 22:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:34.815 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:34.815 "name": "BaseBdev4", 00:20:34.815 "aliases": [ 00:20:34.815 "d2c93983-cdcc-4b76-adc0-4c07860e5ba7" 00:20:34.815 ], 00:20:34.815 "product_name": "Malloc disk", 00:20:34.815 "block_size": 512, 00:20:34.815 "num_blocks": 65536, 00:20:34.815 
"uuid": "d2c93983-cdcc-4b76-adc0-4c07860e5ba7", 00:20:34.815 "assigned_rate_limits": { 00:20:34.815 "rw_ios_per_sec": 0, 00:20:34.815 "rw_mbytes_per_sec": 0, 00:20:34.815 "r_mbytes_per_sec": 0, 00:20:34.815 "w_mbytes_per_sec": 0 00:20:34.815 }, 00:20:34.815 "claimed": true, 00:20:34.816 "claim_type": "exclusive_write", 00:20:34.816 "zoned": false, 00:20:34.816 "supported_io_types": { 00:20:34.816 "read": true, 00:20:34.816 "write": true, 00:20:34.816 "unmap": true, 00:20:34.816 "flush": true, 00:20:34.816 "reset": true, 00:20:34.816 "nvme_admin": false, 00:20:34.816 "nvme_io": false, 00:20:34.816 "nvme_io_md": false, 00:20:34.816 "write_zeroes": true, 00:20:34.816 "zcopy": true, 00:20:34.816 "get_zone_info": false, 00:20:34.816 "zone_management": false, 00:20:34.816 "zone_append": false, 00:20:34.816 "compare": false, 00:20:34.816 "compare_and_write": false, 00:20:34.816 "abort": true, 00:20:34.816 "seek_hole": false, 00:20:34.816 "seek_data": false, 00:20:34.816 "copy": true, 00:20:34.816 "nvme_iov_md": false 00:20:34.816 }, 00:20:34.816 "memory_domains": [ 00:20:34.816 { 00:20:34.816 "dma_device_id": "system", 00:20:34.816 "dma_device_type": 1 00:20:34.816 }, 00:20:34.816 { 00:20:34.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.816 "dma_device_type": 2 00:20:34.816 } 00:20:34.816 ], 00:20:34.816 "driver_specific": {} 00:20:34.816 }' 00:20:34.816 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.816 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.816 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:34.816 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.074 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.074 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:35.074 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.074 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.074 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:35.074 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.074 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.074 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:35.074 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:35.333 [2024-07-12 22:27:45.574468] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:35.334 [2024-07-12 22:27:45.574494] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:35.334 [2024-07-12 22:27:45.574546] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:35.334 [2024-07-12 22:27:45.574609] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:35.334 [2024-07-12 22:27:45.574621] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c4040 name Existed_Raid, state offline 00:20:35.334 22:27:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@341 -- # killprocess 3496834 00:20:35.334 22:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3496834 ']' 00:20:35.334 22:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3496834 00:20:35.334 22:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:20:35.334 22:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:35.334 22:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3496834 00:20:35.334 22:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:35.334 22:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:35.334 22:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3496834' 00:20:35.334 killing process with pid 3496834 00:20:35.334 22:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3496834 00:20:35.334 [2024-07-12 22:27:45.639036] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:35.334 22:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3496834 00:20:35.644 [2024-07-12 22:27:45.676805] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:35.644 22:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:35.644 00:20:35.644 real 0m32.397s 00:20:35.644 user 0m59.437s 00:20:35.644 sys 0m5.790s 00:20:35.644 22:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:35.644 22:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:35.644 ************************************ 00:20:35.644 END TEST raid_state_function_test 00:20:35.644 ************************************ 00:20:35.644 22:27:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:35.644 22:27:45 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:20:35.644 22:27:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:35.644 22:27:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:35.644 22:27:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:35.915 ************************************ 00:20:35.916 START TEST raid_state_function_test_sb 00:20:35.916 ************************************ 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:35.916 
22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3501719 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3501719' 00:20:35.916 Process raid pid: 3501719 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3501719 /var/tmp/spdk-raid.sock 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3501719 ']' 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 
'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:35.916 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:35.916 22:27:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:35.916 [2024-07-12 22:27:46.037726] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:20:35.916 [2024-07-12 22:27:46.037799] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:35.916 [2024-07-12 22:27:46.167550] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:36.173 [2024-07-12 22:27:46.275432] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:36.173 [2024-07-12 22:27:46.342146] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:36.173 [2024-07-12 22:27:46.342175] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:36.740 22:27:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:36.740 22:27:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:20:36.740 22:27:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:36.999 [2024-07-12 22:27:47.160642] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:36.999 [2024-07-12 22:27:47.160682] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:36.999 [2024-07-12 22:27:47.160693] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:36.999 [2024-07-12 22:27:47.160705] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:36.999 [2024-07-12 22:27:47.160714] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:36.999 [2024-07-12 22:27:47.160724] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:36.999 [2024-07-12 22:27:47.160733] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:36.999 [2024-07-12 22:27:47.160744] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:36.999 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:36.999 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:36.999 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:36.999 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:36.999 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:36.999 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
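The sequence the test drives at this point can be reproduced by hand against the same bdev_svc instance. A minimal sketch, assuming the rpc.py path and RPC socket from this run; the SPDK and RPC variables are shorthand introduced here for readability, not part of the test scripts:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Create a concat raid with a superblock (-s) before any base bdev exists;
# the raid bdev is registered but remains in the "configuring" state.
$RPC bdev_raid_create -z 64 -s -r concat \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

# Inspect it the same way verify_raid_bdev_state does below.
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'
# expected output: configuring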
00:20:36.999 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.999 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.999 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.999 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.999 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.999 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:37.566 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:37.566 "name": "Existed_Raid", 00:20:37.566 "uuid": "52996e72-9f2d-47cb-a387-a19c675a53f8", 00:20:37.566 "strip_size_kb": 64, 00:20:37.566 "state": "configuring", 00:20:37.566 "raid_level": "concat", 00:20:37.566 "superblock": true, 00:20:37.566 "num_base_bdevs": 4, 00:20:37.566 "num_base_bdevs_discovered": 0, 00:20:37.566 "num_base_bdevs_operational": 4, 00:20:37.566 "base_bdevs_list": [ 00:20:37.566 { 00:20:37.566 "name": "BaseBdev1", 00:20:37.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.566 "is_configured": false, 00:20:37.566 "data_offset": 0, 00:20:37.566 "data_size": 0 00:20:37.566 }, 00:20:37.566 { 00:20:37.566 "name": "BaseBdev2", 00:20:37.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.566 "is_configured": false, 00:20:37.566 "data_offset": 0, 00:20:37.566 "data_size": 0 00:20:37.566 }, 00:20:37.566 { 00:20:37.566 "name": "BaseBdev3", 00:20:37.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.566 "is_configured": false, 00:20:37.566 "data_offset": 0, 00:20:37.566 "data_size": 0 00:20:37.566 }, 00:20:37.566 { 00:20:37.566 "name": "BaseBdev4", 00:20:37.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.566 "is_configured": false, 00:20:37.566 "data_offset": 0, 00:20:37.566 "data_size": 0 00:20:37.566 } 00:20:37.566 ] 00:20:37.566 }' 00:20:37.566 22:27:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:37.567 22:27:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:38.134 22:27:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:38.392 [2024-07-12 22:27:48.516047] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:38.392 [2024-07-12 22:27:48.516078] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1953aa0 name Existed_Raid, state configuring 00:20:38.392 22:27:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:38.650 [2024-07-12 22:27:48.760725] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:38.650 [2024-07-12 22:27:48.760754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:38.650 [2024-07-12 22:27:48.760764] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:20:38.650 [2024-07-12 22:27:48.760776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:38.650 [2024-07-12 22:27:48.760784] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:38.650 [2024-07-12 22:27:48.760795] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:38.650 [2024-07-12 22:27:48.760804] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:38.650 [2024-07-12 22:27:48.760824] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:38.650 22:27:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:38.908 [2024-07-12 22:27:49.011386] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:38.908 BaseBdev1 00:20:38.908 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:38.908 22:27:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:38.908 22:27:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:38.908 22:27:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:38.908 22:27:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:38.908 22:27:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:38.908 22:27:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:39.166 [ 00:20:39.166 { 00:20:39.166 "name": "BaseBdev1", 00:20:39.166 "aliases": [ 00:20:39.166 "e18245ba-1b85-4914-b593-97794f617647" 00:20:39.166 ], 00:20:39.166 "product_name": "Malloc disk", 00:20:39.166 "block_size": 512, 00:20:39.166 "num_blocks": 65536, 00:20:39.166 "uuid": "e18245ba-1b85-4914-b593-97794f617647", 00:20:39.166 "assigned_rate_limits": { 00:20:39.166 "rw_ios_per_sec": 0, 00:20:39.166 "rw_mbytes_per_sec": 0, 00:20:39.166 "r_mbytes_per_sec": 0, 00:20:39.166 "w_mbytes_per_sec": 0 00:20:39.166 }, 00:20:39.166 "claimed": true, 00:20:39.166 "claim_type": "exclusive_write", 00:20:39.166 "zoned": false, 00:20:39.166 "supported_io_types": { 00:20:39.166 "read": true, 00:20:39.166 "write": true, 00:20:39.166 "unmap": true, 00:20:39.166 "flush": true, 00:20:39.166 "reset": true, 00:20:39.166 "nvme_admin": false, 00:20:39.166 "nvme_io": false, 00:20:39.166 "nvme_io_md": false, 00:20:39.166 "write_zeroes": true, 00:20:39.166 "zcopy": true, 00:20:39.166 "get_zone_info": false, 00:20:39.166 "zone_management": false, 00:20:39.166 "zone_append": false, 00:20:39.166 "compare": false, 00:20:39.166 "compare_and_write": false, 00:20:39.166 "abort": true, 00:20:39.166 "seek_hole": false, 00:20:39.166 "seek_data": false, 00:20:39.166 "copy": true, 00:20:39.166 "nvme_iov_md": false 00:20:39.166 }, 00:20:39.166 "memory_domains": [ 00:20:39.166 { 00:20:39.166 
"dma_device_id": "system", 00:20:39.166 "dma_device_type": 1 00:20:39.166 }, 00:20:39.166 { 00:20:39.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.166 "dma_device_type": 2 00:20:39.166 } 00:20:39.166 ], 00:20:39.166 "driver_specific": {} 00:20:39.166 } 00:20:39.166 ] 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.166 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:39.424 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:39.424 "name": "Existed_Raid", 00:20:39.424 "uuid": "917adf56-0078-49d7-aef2-ebe96686398f", 00:20:39.424 "strip_size_kb": 64, 00:20:39.424 "state": "configuring", 00:20:39.424 "raid_level": "concat", 00:20:39.424 "superblock": true, 00:20:39.424 "num_base_bdevs": 4, 00:20:39.424 "num_base_bdevs_discovered": 1, 00:20:39.424 "num_base_bdevs_operational": 4, 00:20:39.424 "base_bdevs_list": [ 00:20:39.424 { 00:20:39.424 "name": "BaseBdev1", 00:20:39.424 "uuid": "e18245ba-1b85-4914-b593-97794f617647", 00:20:39.424 "is_configured": true, 00:20:39.424 "data_offset": 2048, 00:20:39.424 "data_size": 63488 00:20:39.424 }, 00:20:39.424 { 00:20:39.424 "name": "BaseBdev2", 00:20:39.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.424 "is_configured": false, 00:20:39.424 "data_offset": 0, 00:20:39.424 "data_size": 0 00:20:39.424 }, 00:20:39.424 { 00:20:39.424 "name": "BaseBdev3", 00:20:39.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.424 "is_configured": false, 00:20:39.424 "data_offset": 0, 00:20:39.424 "data_size": 0 00:20:39.424 }, 00:20:39.424 { 00:20:39.424 "name": "BaseBdev4", 00:20:39.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.424 "is_configured": false, 00:20:39.424 "data_offset": 0, 00:20:39.424 "data_size": 0 00:20:39.424 } 00:20:39.424 ] 00:20:39.424 }' 00:20:39.424 22:27:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:39.424 22:27:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 
00:20:40.359 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:40.359 [2024-07-12 22:27:50.551496] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:40.359 [2024-07-12 22:27:50.551539] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1953310 name Existed_Raid, state configuring 00:20:40.359 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:40.617 [2024-07-12 22:27:50.800200] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:40.617 [2024-07-12 22:27:50.801648] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:40.617 [2024-07-12 22:27:50.801680] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:40.617 [2024-07-12 22:27:50.801690] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:40.617 [2024-07-12 22:27:50.801702] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:40.617 [2024-07-12 22:27:50.801711] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:40.617 [2024-07-12 22:27:50.801722] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.617 22:27:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:40.876 22:27:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:40.876 "name": 
"Existed_Raid", 00:20:40.876 "uuid": "8f23affc-e698-4da6-972c-48f5700da075", 00:20:40.876 "strip_size_kb": 64, 00:20:40.876 "state": "configuring", 00:20:40.876 "raid_level": "concat", 00:20:40.876 "superblock": true, 00:20:40.876 "num_base_bdevs": 4, 00:20:40.876 "num_base_bdevs_discovered": 1, 00:20:40.876 "num_base_bdevs_operational": 4, 00:20:40.876 "base_bdevs_list": [ 00:20:40.876 { 00:20:40.876 "name": "BaseBdev1", 00:20:40.876 "uuid": "e18245ba-1b85-4914-b593-97794f617647", 00:20:40.876 "is_configured": true, 00:20:40.876 "data_offset": 2048, 00:20:40.876 "data_size": 63488 00:20:40.876 }, 00:20:40.876 { 00:20:40.876 "name": "BaseBdev2", 00:20:40.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.876 "is_configured": false, 00:20:40.876 "data_offset": 0, 00:20:40.876 "data_size": 0 00:20:40.876 }, 00:20:40.876 { 00:20:40.876 "name": "BaseBdev3", 00:20:40.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.876 "is_configured": false, 00:20:40.876 "data_offset": 0, 00:20:40.876 "data_size": 0 00:20:40.876 }, 00:20:40.876 { 00:20:40.876 "name": "BaseBdev4", 00:20:40.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.876 "is_configured": false, 00:20:40.876 "data_offset": 0, 00:20:40.876 "data_size": 0 00:20:40.876 } 00:20:40.876 ] 00:20:40.876 }' 00:20:40.876 22:27:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:40.876 22:27:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:41.442 22:27:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:41.700 [2024-07-12 22:27:51.890638] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:41.700 BaseBdev2 00:20:41.700 22:27:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:41.700 22:27:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:41.700 22:27:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:41.700 22:27:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:41.700 22:27:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:41.700 22:27:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:41.700 22:27:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:41.958 22:27:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:42.216 [ 00:20:42.216 { 00:20:42.216 "name": "BaseBdev2", 00:20:42.216 "aliases": [ 00:20:42.216 "e3b14892-813a-4084-aabb-549f6402454d" 00:20:42.216 ], 00:20:42.216 "product_name": "Malloc disk", 00:20:42.216 "block_size": 512, 00:20:42.216 "num_blocks": 65536, 00:20:42.216 "uuid": "e3b14892-813a-4084-aabb-549f6402454d", 00:20:42.216 "assigned_rate_limits": { 00:20:42.216 "rw_ios_per_sec": 0, 00:20:42.216 "rw_mbytes_per_sec": 0, 00:20:42.216 "r_mbytes_per_sec": 0, 00:20:42.216 "w_mbytes_per_sec": 0 00:20:42.216 }, 00:20:42.216 "claimed": true, 
00:20:42.216 "claim_type": "exclusive_write", 00:20:42.216 "zoned": false, 00:20:42.216 "supported_io_types": { 00:20:42.216 "read": true, 00:20:42.216 "write": true, 00:20:42.216 "unmap": true, 00:20:42.216 "flush": true, 00:20:42.216 "reset": true, 00:20:42.216 "nvme_admin": false, 00:20:42.216 "nvme_io": false, 00:20:42.216 "nvme_io_md": false, 00:20:42.216 "write_zeroes": true, 00:20:42.216 "zcopy": true, 00:20:42.216 "get_zone_info": false, 00:20:42.216 "zone_management": false, 00:20:42.216 "zone_append": false, 00:20:42.216 "compare": false, 00:20:42.216 "compare_and_write": false, 00:20:42.216 "abort": true, 00:20:42.216 "seek_hole": false, 00:20:42.216 "seek_data": false, 00:20:42.216 "copy": true, 00:20:42.216 "nvme_iov_md": false 00:20:42.216 }, 00:20:42.216 "memory_domains": [ 00:20:42.216 { 00:20:42.216 "dma_device_id": "system", 00:20:42.216 "dma_device_type": 1 00:20:42.216 }, 00:20:42.216 { 00:20:42.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.216 "dma_device_type": 2 00:20:42.216 } 00:20:42.216 ], 00:20:42.216 "driver_specific": {} 00:20:42.216 } 00:20:42.216 ] 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.216 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:42.474 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.474 "name": "Existed_Raid", 00:20:42.474 "uuid": "8f23affc-e698-4da6-972c-48f5700da075", 00:20:42.474 "strip_size_kb": 64, 00:20:42.474 "state": "configuring", 00:20:42.474 "raid_level": "concat", 00:20:42.474 "superblock": true, 00:20:42.474 "num_base_bdevs": 4, 00:20:42.474 "num_base_bdevs_discovered": 2, 00:20:42.474 "num_base_bdevs_operational": 4, 00:20:42.474 "base_bdevs_list": [ 00:20:42.474 { 00:20:42.474 "name": "BaseBdev1", 00:20:42.474 "uuid": 
"e18245ba-1b85-4914-b593-97794f617647", 00:20:42.474 "is_configured": true, 00:20:42.474 "data_offset": 2048, 00:20:42.474 "data_size": 63488 00:20:42.474 }, 00:20:42.474 { 00:20:42.474 "name": "BaseBdev2", 00:20:42.474 "uuid": "e3b14892-813a-4084-aabb-549f6402454d", 00:20:42.474 "is_configured": true, 00:20:42.474 "data_offset": 2048, 00:20:42.474 "data_size": 63488 00:20:42.474 }, 00:20:42.474 { 00:20:42.474 "name": "BaseBdev3", 00:20:42.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:42.474 "is_configured": false, 00:20:42.474 "data_offset": 0, 00:20:42.474 "data_size": 0 00:20:42.474 }, 00:20:42.474 { 00:20:42.474 "name": "BaseBdev4", 00:20:42.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:42.474 "is_configured": false, 00:20:42.474 "data_offset": 0, 00:20:42.474 "data_size": 0 00:20:42.474 } 00:20:42.474 ] 00:20:42.474 }' 00:20:42.474 22:27:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:42.474 22:27:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:43.039 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:43.297 [2024-07-12 22:27:53.390107] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:43.297 BaseBdev3 00:20:43.297 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:43.297 22:27:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:43.297 22:27:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:43.297 22:27:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:43.297 22:27:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:43.297 22:27:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:43.297 22:27:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:43.554 22:27:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:43.812 [ 00:20:43.812 { 00:20:43.812 "name": "BaseBdev3", 00:20:43.812 "aliases": [ 00:20:43.812 "16594b26-ce69-43f1-9ec3-dff2a6a8da58" 00:20:43.812 ], 00:20:43.812 "product_name": "Malloc disk", 00:20:43.812 "block_size": 512, 00:20:43.812 "num_blocks": 65536, 00:20:43.812 "uuid": "16594b26-ce69-43f1-9ec3-dff2a6a8da58", 00:20:43.812 "assigned_rate_limits": { 00:20:43.812 "rw_ios_per_sec": 0, 00:20:43.812 "rw_mbytes_per_sec": 0, 00:20:43.812 "r_mbytes_per_sec": 0, 00:20:43.812 "w_mbytes_per_sec": 0 00:20:43.812 }, 00:20:43.812 "claimed": true, 00:20:43.812 "claim_type": "exclusive_write", 00:20:43.812 "zoned": false, 00:20:43.812 "supported_io_types": { 00:20:43.812 "read": true, 00:20:43.812 "write": true, 00:20:43.812 "unmap": true, 00:20:43.812 "flush": true, 00:20:43.812 "reset": true, 00:20:43.812 "nvme_admin": false, 00:20:43.812 "nvme_io": false, 00:20:43.812 "nvme_io_md": false, 00:20:43.812 "write_zeroes": true, 00:20:43.812 "zcopy": true, 00:20:43.812 "get_zone_info": 
false, 00:20:43.812 "zone_management": false, 00:20:43.812 "zone_append": false, 00:20:43.812 "compare": false, 00:20:43.812 "compare_and_write": false, 00:20:43.812 "abort": true, 00:20:43.812 "seek_hole": false, 00:20:43.812 "seek_data": false, 00:20:43.812 "copy": true, 00:20:43.812 "nvme_iov_md": false 00:20:43.812 }, 00:20:43.812 "memory_domains": [ 00:20:43.812 { 00:20:43.812 "dma_device_id": "system", 00:20:43.812 "dma_device_type": 1 00:20:43.812 }, 00:20:43.812 { 00:20:43.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.812 "dma_device_type": 2 00:20:43.812 } 00:20:43.812 ], 00:20:43.812 "driver_specific": {} 00:20:43.812 } 00:20:43.812 ] 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.812 22:27:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:44.070 22:27:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:44.070 "name": "Existed_Raid", 00:20:44.070 "uuid": "8f23affc-e698-4da6-972c-48f5700da075", 00:20:44.070 "strip_size_kb": 64, 00:20:44.070 "state": "configuring", 00:20:44.070 "raid_level": "concat", 00:20:44.070 "superblock": true, 00:20:44.070 "num_base_bdevs": 4, 00:20:44.070 "num_base_bdevs_discovered": 3, 00:20:44.070 "num_base_bdevs_operational": 4, 00:20:44.070 "base_bdevs_list": [ 00:20:44.070 { 00:20:44.070 "name": "BaseBdev1", 00:20:44.070 "uuid": "e18245ba-1b85-4914-b593-97794f617647", 00:20:44.070 "is_configured": true, 00:20:44.070 "data_offset": 2048, 00:20:44.070 "data_size": 63488 00:20:44.070 }, 00:20:44.070 { 00:20:44.071 "name": "BaseBdev2", 00:20:44.071 "uuid": "e3b14892-813a-4084-aabb-549f6402454d", 00:20:44.071 "is_configured": true, 00:20:44.071 "data_offset": 2048, 00:20:44.071 "data_size": 63488 00:20:44.071 }, 00:20:44.071 { 00:20:44.071 "name": "BaseBdev3", 00:20:44.071 "uuid": 
"16594b26-ce69-43f1-9ec3-dff2a6a8da58", 00:20:44.071 "is_configured": true, 00:20:44.071 "data_offset": 2048, 00:20:44.071 "data_size": 63488 00:20:44.071 }, 00:20:44.071 { 00:20:44.071 "name": "BaseBdev4", 00:20:44.071 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.071 "is_configured": false, 00:20:44.071 "data_offset": 0, 00:20:44.071 "data_size": 0 00:20:44.071 } 00:20:44.071 ] 00:20:44.071 }' 00:20:44.071 22:27:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:44.071 22:27:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:44.637 22:27:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:44.637 [2024-07-12 22:27:54.949727] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:44.637 [2024-07-12 22:27:54.949901] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1954350 00:20:44.637 [2024-07-12 22:27:54.949915] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:44.637 [2024-07-12 22:27:54.950101] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1954020 00:20:44.637 [2024-07-12 22:27:54.950221] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1954350 00:20:44.637 [2024-07-12 22:27:54.950231] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1954350 00:20:44.637 [2024-07-12 22:27:54.950321] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:44.637 BaseBdev4 00:20:44.894 22:27:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:44.894 22:27:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:44.894 22:27:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:44.894 22:27:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:44.894 22:27:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:44.895 22:27:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:44.895 22:27:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:44.895 22:27:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:45.152 [ 00:20:45.152 { 00:20:45.152 "name": "BaseBdev4", 00:20:45.152 "aliases": [ 00:20:45.152 "71a518dc-9832-4246-abac-1dd1655a3aa2" 00:20:45.152 ], 00:20:45.152 "product_name": "Malloc disk", 00:20:45.152 "block_size": 512, 00:20:45.152 "num_blocks": 65536, 00:20:45.152 "uuid": "71a518dc-9832-4246-abac-1dd1655a3aa2", 00:20:45.152 "assigned_rate_limits": { 00:20:45.152 "rw_ios_per_sec": 0, 00:20:45.152 "rw_mbytes_per_sec": 0, 00:20:45.152 "r_mbytes_per_sec": 0, 00:20:45.152 "w_mbytes_per_sec": 0 00:20:45.152 }, 00:20:45.152 "claimed": true, 00:20:45.153 "claim_type": "exclusive_write", 00:20:45.153 "zoned": false, 00:20:45.153 "supported_io_types": { 00:20:45.153 "read": 
true, 00:20:45.153 "write": true, 00:20:45.153 "unmap": true, 00:20:45.153 "flush": true, 00:20:45.153 "reset": true, 00:20:45.153 "nvme_admin": false, 00:20:45.153 "nvme_io": false, 00:20:45.153 "nvme_io_md": false, 00:20:45.153 "write_zeroes": true, 00:20:45.153 "zcopy": true, 00:20:45.153 "get_zone_info": false, 00:20:45.153 "zone_management": false, 00:20:45.153 "zone_append": false, 00:20:45.153 "compare": false, 00:20:45.153 "compare_and_write": false, 00:20:45.153 "abort": true, 00:20:45.153 "seek_hole": false, 00:20:45.153 "seek_data": false, 00:20:45.153 "copy": true, 00:20:45.153 "nvme_iov_md": false 00:20:45.153 }, 00:20:45.153 "memory_domains": [ 00:20:45.153 { 00:20:45.153 "dma_device_id": "system", 00:20:45.153 "dma_device_type": 1 00:20:45.153 }, 00:20:45.153 { 00:20:45.153 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:45.153 "dma_device_type": 2 00:20:45.153 } 00:20:45.153 ], 00:20:45.153 "driver_specific": {} 00:20:45.153 } 00:20:45.153 ] 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.153 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:45.410 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:45.410 "name": "Existed_Raid", 00:20:45.410 "uuid": "8f23affc-e698-4da6-972c-48f5700da075", 00:20:45.410 "strip_size_kb": 64, 00:20:45.410 "state": "online", 00:20:45.410 "raid_level": "concat", 00:20:45.410 "superblock": true, 00:20:45.410 "num_base_bdevs": 4, 00:20:45.410 "num_base_bdevs_discovered": 4, 00:20:45.410 "num_base_bdevs_operational": 4, 00:20:45.410 "base_bdevs_list": [ 00:20:45.410 { 00:20:45.410 "name": "BaseBdev1", 00:20:45.410 "uuid": "e18245ba-1b85-4914-b593-97794f617647", 00:20:45.410 "is_configured": true, 00:20:45.410 "data_offset": 2048, 00:20:45.410 "data_size": 63488 00:20:45.410 }, 
00:20:45.410 { 00:20:45.410 "name": "BaseBdev2", 00:20:45.410 "uuid": "e3b14892-813a-4084-aabb-549f6402454d", 00:20:45.410 "is_configured": true, 00:20:45.410 "data_offset": 2048, 00:20:45.410 "data_size": 63488 00:20:45.410 }, 00:20:45.410 { 00:20:45.410 "name": "BaseBdev3", 00:20:45.410 "uuid": "16594b26-ce69-43f1-9ec3-dff2a6a8da58", 00:20:45.410 "is_configured": true, 00:20:45.410 "data_offset": 2048, 00:20:45.410 "data_size": 63488 00:20:45.410 }, 00:20:45.410 { 00:20:45.410 "name": "BaseBdev4", 00:20:45.410 "uuid": "71a518dc-9832-4246-abac-1dd1655a3aa2", 00:20:45.410 "is_configured": true, 00:20:45.410 "data_offset": 2048, 00:20:45.410 "data_size": 63488 00:20:45.410 } 00:20:45.410 ] 00:20:45.410 }' 00:20:45.410 22:27:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:45.410 22:27:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:45.976 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:45.976 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:45.976 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:45.976 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:45.976 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:45.976 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:46.234 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:46.234 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:46.234 [2024-07-12 22:27:56.526257] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:46.234 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:46.234 "name": "Existed_Raid", 00:20:46.234 "aliases": [ 00:20:46.234 "8f23affc-e698-4da6-972c-48f5700da075" 00:20:46.234 ], 00:20:46.234 "product_name": "Raid Volume", 00:20:46.234 "block_size": 512, 00:20:46.234 "num_blocks": 253952, 00:20:46.234 "uuid": "8f23affc-e698-4da6-972c-48f5700da075", 00:20:46.234 "assigned_rate_limits": { 00:20:46.234 "rw_ios_per_sec": 0, 00:20:46.234 "rw_mbytes_per_sec": 0, 00:20:46.234 "r_mbytes_per_sec": 0, 00:20:46.234 "w_mbytes_per_sec": 0 00:20:46.234 }, 00:20:46.234 "claimed": false, 00:20:46.234 "zoned": false, 00:20:46.234 "supported_io_types": { 00:20:46.234 "read": true, 00:20:46.234 "write": true, 00:20:46.234 "unmap": true, 00:20:46.234 "flush": true, 00:20:46.234 "reset": true, 00:20:46.234 "nvme_admin": false, 00:20:46.234 "nvme_io": false, 00:20:46.234 "nvme_io_md": false, 00:20:46.234 "write_zeroes": true, 00:20:46.234 "zcopy": false, 00:20:46.234 "get_zone_info": false, 00:20:46.234 "zone_management": false, 00:20:46.234 "zone_append": false, 00:20:46.234 "compare": false, 00:20:46.234 "compare_and_write": false, 00:20:46.234 "abort": false, 00:20:46.234 "seek_hole": false, 00:20:46.234 "seek_data": false, 00:20:46.234 "copy": false, 00:20:46.234 "nvme_iov_md": false 00:20:46.234 }, 00:20:46.234 "memory_domains": [ 00:20:46.234 { 00:20:46.234 "dma_device_id": "system", 00:20:46.234 "dma_device_type": 1 00:20:46.234 }, 
00:20:46.234 { 00:20:46.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.234 "dma_device_type": 2 00:20:46.234 }, 00:20:46.234 { 00:20:46.234 "dma_device_id": "system", 00:20:46.234 "dma_device_type": 1 00:20:46.234 }, 00:20:46.234 { 00:20:46.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.234 "dma_device_type": 2 00:20:46.234 }, 00:20:46.234 { 00:20:46.234 "dma_device_id": "system", 00:20:46.234 "dma_device_type": 1 00:20:46.234 }, 00:20:46.234 { 00:20:46.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.234 "dma_device_type": 2 00:20:46.234 }, 00:20:46.234 { 00:20:46.234 "dma_device_id": "system", 00:20:46.234 "dma_device_type": 1 00:20:46.234 }, 00:20:46.234 { 00:20:46.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.234 "dma_device_type": 2 00:20:46.234 } 00:20:46.234 ], 00:20:46.234 "driver_specific": { 00:20:46.234 "raid": { 00:20:46.234 "uuid": "8f23affc-e698-4da6-972c-48f5700da075", 00:20:46.234 "strip_size_kb": 64, 00:20:46.234 "state": "online", 00:20:46.234 "raid_level": "concat", 00:20:46.234 "superblock": true, 00:20:46.234 "num_base_bdevs": 4, 00:20:46.234 "num_base_bdevs_discovered": 4, 00:20:46.234 "num_base_bdevs_operational": 4, 00:20:46.234 "base_bdevs_list": [ 00:20:46.234 { 00:20:46.234 "name": "BaseBdev1", 00:20:46.234 "uuid": "e18245ba-1b85-4914-b593-97794f617647", 00:20:46.234 "is_configured": true, 00:20:46.234 "data_offset": 2048, 00:20:46.234 "data_size": 63488 00:20:46.234 }, 00:20:46.234 { 00:20:46.234 "name": "BaseBdev2", 00:20:46.234 "uuid": "e3b14892-813a-4084-aabb-549f6402454d", 00:20:46.234 "is_configured": true, 00:20:46.234 "data_offset": 2048, 00:20:46.234 "data_size": 63488 00:20:46.234 }, 00:20:46.234 { 00:20:46.234 "name": "BaseBdev3", 00:20:46.234 "uuid": "16594b26-ce69-43f1-9ec3-dff2a6a8da58", 00:20:46.234 "is_configured": true, 00:20:46.234 "data_offset": 2048, 00:20:46.234 "data_size": 63488 00:20:46.234 }, 00:20:46.234 { 00:20:46.234 "name": "BaseBdev4", 00:20:46.234 "uuid": "71a518dc-9832-4246-abac-1dd1655a3aa2", 00:20:46.234 "is_configured": true, 00:20:46.234 "data_offset": 2048, 00:20:46.234 "data_size": 63488 00:20:46.234 } 00:20:46.234 ] 00:20:46.234 } 00:20:46.234 } 00:20:46.234 }' 00:20:46.234 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:46.492 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:46.492 BaseBdev2 00:20:46.492 BaseBdev3 00:20:46.492 BaseBdev4' 00:20:46.492 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:46.492 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:46.492 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:46.750 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:46.750 "name": "BaseBdev1", 00:20:46.750 "aliases": [ 00:20:46.750 "e18245ba-1b85-4914-b593-97794f617647" 00:20:46.750 ], 00:20:46.750 "product_name": "Malloc disk", 00:20:46.750 "block_size": 512, 00:20:46.750 "num_blocks": 65536, 00:20:46.750 "uuid": "e18245ba-1b85-4914-b593-97794f617647", 00:20:46.750 "assigned_rate_limits": { 00:20:46.750 "rw_ios_per_sec": 0, 00:20:46.750 "rw_mbytes_per_sec": 0, 00:20:46.750 "r_mbytes_per_sec": 0, 00:20:46.750 
"w_mbytes_per_sec": 0 00:20:46.750 }, 00:20:46.750 "claimed": true, 00:20:46.750 "claim_type": "exclusive_write", 00:20:46.750 "zoned": false, 00:20:46.750 "supported_io_types": { 00:20:46.750 "read": true, 00:20:46.750 "write": true, 00:20:46.750 "unmap": true, 00:20:46.750 "flush": true, 00:20:46.750 "reset": true, 00:20:46.750 "nvme_admin": false, 00:20:46.750 "nvme_io": false, 00:20:46.750 "nvme_io_md": false, 00:20:46.750 "write_zeroes": true, 00:20:46.750 "zcopy": true, 00:20:46.750 "get_zone_info": false, 00:20:46.750 "zone_management": false, 00:20:46.750 "zone_append": false, 00:20:46.750 "compare": false, 00:20:46.750 "compare_and_write": false, 00:20:46.750 "abort": true, 00:20:46.750 "seek_hole": false, 00:20:46.750 "seek_data": false, 00:20:46.750 "copy": true, 00:20:46.750 "nvme_iov_md": false 00:20:46.750 }, 00:20:46.750 "memory_domains": [ 00:20:46.750 { 00:20:46.750 "dma_device_id": "system", 00:20:46.750 "dma_device_type": 1 00:20:46.750 }, 00:20:46.750 { 00:20:46.750 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.750 "dma_device_type": 2 00:20:46.750 } 00:20:46.750 ], 00:20:46.750 "driver_specific": {} 00:20:46.750 }' 00:20:46.750 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.750 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.750 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:46.750 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:46.750 22:27:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:46.750 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:46.750 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:46.750 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:47.008 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:47.008 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.008 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.008 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:47.008 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:47.008 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:47.008 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:47.267 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:47.267 "name": "BaseBdev2", 00:20:47.267 "aliases": [ 00:20:47.267 "e3b14892-813a-4084-aabb-549f6402454d" 00:20:47.267 ], 00:20:47.267 "product_name": "Malloc disk", 00:20:47.267 "block_size": 512, 00:20:47.267 "num_blocks": 65536, 00:20:47.267 "uuid": "e3b14892-813a-4084-aabb-549f6402454d", 00:20:47.267 "assigned_rate_limits": { 00:20:47.267 "rw_ios_per_sec": 0, 00:20:47.267 "rw_mbytes_per_sec": 0, 00:20:47.267 "r_mbytes_per_sec": 0, 00:20:47.267 "w_mbytes_per_sec": 0 00:20:47.267 }, 00:20:47.267 "claimed": true, 00:20:47.267 "claim_type": "exclusive_write", 00:20:47.267 
"zoned": false, 00:20:47.267 "supported_io_types": { 00:20:47.267 "read": true, 00:20:47.267 "write": true, 00:20:47.267 "unmap": true, 00:20:47.267 "flush": true, 00:20:47.267 "reset": true, 00:20:47.267 "nvme_admin": false, 00:20:47.267 "nvme_io": false, 00:20:47.267 "nvme_io_md": false, 00:20:47.267 "write_zeroes": true, 00:20:47.267 "zcopy": true, 00:20:47.267 "get_zone_info": false, 00:20:47.267 "zone_management": false, 00:20:47.267 "zone_append": false, 00:20:47.267 "compare": false, 00:20:47.267 "compare_and_write": false, 00:20:47.267 "abort": true, 00:20:47.267 "seek_hole": false, 00:20:47.267 "seek_data": false, 00:20:47.267 "copy": true, 00:20:47.267 "nvme_iov_md": false 00:20:47.267 }, 00:20:47.267 "memory_domains": [ 00:20:47.267 { 00:20:47.267 "dma_device_id": "system", 00:20:47.267 "dma_device_type": 1 00:20:47.267 }, 00:20:47.267 { 00:20:47.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:47.267 "dma_device_type": 2 00:20:47.267 } 00:20:47.267 ], 00:20:47.267 "driver_specific": {} 00:20:47.267 }' 00:20:47.267 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:47.267 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:47.267 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:47.267 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:47.267 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:47.525 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:47.525 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:47.525 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:47.525 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:47.525 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.525 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.525 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:47.525 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:47.525 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:47.525 22:27:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:47.783 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:47.783 "name": "BaseBdev3", 00:20:47.783 "aliases": [ 00:20:47.783 "16594b26-ce69-43f1-9ec3-dff2a6a8da58" 00:20:47.783 ], 00:20:47.783 "product_name": "Malloc disk", 00:20:47.783 "block_size": 512, 00:20:47.783 "num_blocks": 65536, 00:20:47.783 "uuid": "16594b26-ce69-43f1-9ec3-dff2a6a8da58", 00:20:47.783 "assigned_rate_limits": { 00:20:47.783 "rw_ios_per_sec": 0, 00:20:47.783 "rw_mbytes_per_sec": 0, 00:20:47.783 "r_mbytes_per_sec": 0, 00:20:47.783 "w_mbytes_per_sec": 0 00:20:47.783 }, 00:20:47.783 "claimed": true, 00:20:47.783 "claim_type": "exclusive_write", 00:20:47.783 "zoned": false, 00:20:47.783 "supported_io_types": { 00:20:47.783 "read": true, 00:20:47.783 "write": true, 00:20:47.783 "unmap": 
true, 00:20:47.783 "flush": true, 00:20:47.783 "reset": true, 00:20:47.783 "nvme_admin": false, 00:20:47.783 "nvme_io": false, 00:20:47.783 "nvme_io_md": false, 00:20:47.783 "write_zeroes": true, 00:20:47.783 "zcopy": true, 00:20:47.783 "get_zone_info": false, 00:20:47.783 "zone_management": false, 00:20:47.783 "zone_append": false, 00:20:47.783 "compare": false, 00:20:47.783 "compare_and_write": false, 00:20:47.783 "abort": true, 00:20:47.783 "seek_hole": false, 00:20:47.783 "seek_data": false, 00:20:47.783 "copy": true, 00:20:47.783 "nvme_iov_md": false 00:20:47.783 }, 00:20:47.783 "memory_domains": [ 00:20:47.783 { 00:20:47.783 "dma_device_id": "system", 00:20:47.783 "dma_device_type": 1 00:20:47.783 }, 00:20:47.783 { 00:20:47.783 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:47.783 "dma_device_type": 2 00:20:47.783 } 00:20:47.783 ], 00:20:47.783 "driver_specific": {} 00:20:47.783 }' 00:20:47.783 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:48.041 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:48.041 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:48.041 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:48.041 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:48.041 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:48.041 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:48.041 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:48.041 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:48.041 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:48.299 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:48.299 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:48.299 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:48.299 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:48.299 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:48.557 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:48.557 "name": "BaseBdev4", 00:20:48.557 "aliases": [ 00:20:48.557 "71a518dc-9832-4246-abac-1dd1655a3aa2" 00:20:48.557 ], 00:20:48.557 "product_name": "Malloc disk", 00:20:48.557 "block_size": 512, 00:20:48.557 "num_blocks": 65536, 00:20:48.557 "uuid": "71a518dc-9832-4246-abac-1dd1655a3aa2", 00:20:48.557 "assigned_rate_limits": { 00:20:48.557 "rw_ios_per_sec": 0, 00:20:48.557 "rw_mbytes_per_sec": 0, 00:20:48.557 "r_mbytes_per_sec": 0, 00:20:48.557 "w_mbytes_per_sec": 0 00:20:48.557 }, 00:20:48.557 "claimed": true, 00:20:48.557 "claim_type": "exclusive_write", 00:20:48.557 "zoned": false, 00:20:48.557 "supported_io_types": { 00:20:48.557 "read": true, 00:20:48.557 "write": true, 00:20:48.557 "unmap": true, 00:20:48.557 "flush": true, 00:20:48.557 "reset": true, 00:20:48.557 "nvme_admin": false, 00:20:48.557 "nvme_io": false, 
00:20:48.557 "nvme_io_md": false, 00:20:48.557 "write_zeroes": true, 00:20:48.557 "zcopy": true, 00:20:48.557 "get_zone_info": false, 00:20:48.557 "zone_management": false, 00:20:48.557 "zone_append": false, 00:20:48.557 "compare": false, 00:20:48.557 "compare_and_write": false, 00:20:48.557 "abort": true, 00:20:48.557 "seek_hole": false, 00:20:48.557 "seek_data": false, 00:20:48.557 "copy": true, 00:20:48.557 "nvme_iov_md": false 00:20:48.557 }, 00:20:48.557 "memory_domains": [ 00:20:48.557 { 00:20:48.557 "dma_device_id": "system", 00:20:48.557 "dma_device_type": 1 00:20:48.557 }, 00:20:48.557 { 00:20:48.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:48.557 "dma_device_type": 2 00:20:48.557 } 00:20:48.557 ], 00:20:48.557 "driver_specific": {} 00:20:48.557 }' 00:20:48.557 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:48.557 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:48.557 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:48.557 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:48.557 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:48.557 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:48.558 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:48.816 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:48.816 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:48.816 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:48.816 22:27:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:48.816 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:48.816 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:49.075 [2024-07-12 22:27:59.233174] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:49.075 [2024-07-12 22:27:59.233202] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:49.075 [2024-07-12 22:27:59.233250] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:49.075 22:27:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.075 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:49.332 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:49.332 "name": "Existed_Raid", 00:20:49.332 "uuid": "8f23affc-e698-4da6-972c-48f5700da075", 00:20:49.332 "strip_size_kb": 64, 00:20:49.332 "state": "offline", 00:20:49.332 "raid_level": "concat", 00:20:49.332 "superblock": true, 00:20:49.332 "num_base_bdevs": 4, 00:20:49.332 "num_base_bdevs_discovered": 3, 00:20:49.332 "num_base_bdevs_operational": 3, 00:20:49.332 "base_bdevs_list": [ 00:20:49.332 { 00:20:49.333 "name": null, 00:20:49.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:49.333 "is_configured": false, 00:20:49.333 "data_offset": 2048, 00:20:49.333 "data_size": 63488 00:20:49.333 }, 00:20:49.333 { 00:20:49.333 "name": "BaseBdev2", 00:20:49.333 "uuid": "e3b14892-813a-4084-aabb-549f6402454d", 00:20:49.333 "is_configured": true, 00:20:49.333 "data_offset": 2048, 00:20:49.333 "data_size": 63488 00:20:49.333 }, 00:20:49.333 { 00:20:49.333 "name": "BaseBdev3", 00:20:49.333 "uuid": "16594b26-ce69-43f1-9ec3-dff2a6a8da58", 00:20:49.333 "is_configured": true, 00:20:49.333 "data_offset": 2048, 00:20:49.333 "data_size": 63488 00:20:49.333 }, 00:20:49.333 { 00:20:49.333 "name": "BaseBdev4", 00:20:49.333 "uuid": "71a518dc-9832-4246-abac-1dd1655a3aa2", 00:20:49.333 "is_configured": true, 00:20:49.333 "data_offset": 2048, 00:20:49.333 "data_size": 63488 00:20:49.333 } 00:20:49.333 ] 00:20:49.333 }' 00:20:49.333 22:27:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:49.333 22:27:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:49.897 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:49.897 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:49.897 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:49.897 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.154 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:50.154 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:50.155 22:28:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:50.412 [2024-07-12 22:28:00.578680] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:50.412 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:50.412 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:50.412 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.412 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:50.670 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:50.670 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:50.670 22:28:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:50.928 [2024-07-12 22:28:01.084560] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:50.928 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:50.928 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:50.928 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:50.928 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.185 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:51.185 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:51.185 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:51.443 [2024-07-12 22:28:01.594615] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:51.443 [2024-07-12 22:28:01.594658] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1954350 name Existed_Raid, state offline 00:20:51.443 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:51.443 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:51.443 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:51.443 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.700 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:51.700 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:51.700 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:51.700 22:28:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:51.700 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:51.700 22:28:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:52.023 BaseBdev2 00:20:52.023 22:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:52.023 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:52.023 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:52.023 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:52.023 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:52.023 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:52.023 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:52.281 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:52.281 [ 00:20:52.281 { 00:20:52.281 "name": "BaseBdev2", 00:20:52.281 "aliases": [ 00:20:52.281 "9cdfba0e-b58c-477d-982a-7fb8589fa35f" 00:20:52.281 ], 00:20:52.281 "product_name": "Malloc disk", 00:20:52.281 "block_size": 512, 00:20:52.281 "num_blocks": 65536, 00:20:52.281 "uuid": "9cdfba0e-b58c-477d-982a-7fb8589fa35f", 00:20:52.281 "assigned_rate_limits": { 00:20:52.281 "rw_ios_per_sec": 0, 00:20:52.281 "rw_mbytes_per_sec": 0, 00:20:52.281 "r_mbytes_per_sec": 0, 00:20:52.281 "w_mbytes_per_sec": 0 00:20:52.281 }, 00:20:52.281 "claimed": false, 00:20:52.281 "zoned": false, 00:20:52.281 "supported_io_types": { 00:20:52.281 "read": true, 00:20:52.281 "write": true, 00:20:52.281 "unmap": true, 00:20:52.281 "flush": true, 00:20:52.281 "reset": true, 00:20:52.281 "nvme_admin": false, 00:20:52.281 "nvme_io": false, 00:20:52.281 "nvme_io_md": false, 00:20:52.281 "write_zeroes": true, 00:20:52.281 "zcopy": true, 00:20:52.281 "get_zone_info": false, 00:20:52.281 "zone_management": false, 00:20:52.281 "zone_append": false, 00:20:52.281 "compare": false, 00:20:52.281 "compare_and_write": false, 00:20:52.281 "abort": true, 00:20:52.281 "seek_hole": false, 00:20:52.281 "seek_data": false, 00:20:52.281 "copy": true, 00:20:52.281 "nvme_iov_md": false 00:20:52.281 }, 00:20:52.281 "memory_domains": [ 00:20:52.281 { 00:20:52.281 "dma_device_id": "system", 00:20:52.281 "dma_device_type": 1 00:20:52.281 }, 00:20:52.281 { 00:20:52.281 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:52.281 "dma_device_type": 2 00:20:52.281 } 00:20:52.281 ], 00:20:52.281 "driver_specific": {} 00:20:52.281 } 00:20:52.281 ] 00:20:52.281 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:52.281 22:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:52.281 22:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:52.281 22:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:52.538 BaseBdev3 00:20:52.796 22:28:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:52.796 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:52.796 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:52.796 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:52.796 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:52.796 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:52.796 22:28:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:52.796 22:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:53.054 [ 00:20:53.054 { 00:20:53.054 "name": "BaseBdev3", 00:20:53.054 "aliases": [ 00:20:53.054 "7f7744e4-c2ab-488c-a222-d437f65462f3" 00:20:53.054 ], 00:20:53.054 "product_name": "Malloc disk", 00:20:53.054 "block_size": 512, 00:20:53.054 "num_blocks": 65536, 00:20:53.054 "uuid": "7f7744e4-c2ab-488c-a222-d437f65462f3", 00:20:53.054 "assigned_rate_limits": { 00:20:53.054 "rw_ios_per_sec": 0, 00:20:53.054 "rw_mbytes_per_sec": 0, 00:20:53.054 "r_mbytes_per_sec": 0, 00:20:53.054 "w_mbytes_per_sec": 0 00:20:53.054 }, 00:20:53.054 "claimed": false, 00:20:53.054 "zoned": false, 00:20:53.054 "supported_io_types": { 00:20:53.054 "read": true, 00:20:53.054 "write": true, 00:20:53.054 "unmap": true, 00:20:53.054 "flush": true, 00:20:53.054 "reset": true, 00:20:53.054 "nvme_admin": false, 00:20:53.054 "nvme_io": false, 00:20:53.054 "nvme_io_md": false, 00:20:53.054 "write_zeroes": true, 00:20:53.054 "zcopy": true, 00:20:53.054 "get_zone_info": false, 00:20:53.054 "zone_management": false, 00:20:53.054 "zone_append": false, 00:20:53.054 "compare": false, 00:20:53.054 "compare_and_write": false, 00:20:53.054 "abort": true, 00:20:53.054 "seek_hole": false, 00:20:53.054 "seek_data": false, 00:20:53.054 "copy": true, 00:20:53.054 "nvme_iov_md": false 00:20:53.054 }, 00:20:53.054 "memory_domains": [ 00:20:53.054 { 00:20:53.054 "dma_device_id": "system", 00:20:53.054 "dma_device_type": 1 00:20:53.054 }, 00:20:53.054 { 00:20:53.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.054 "dma_device_type": 2 00:20:53.054 } 00:20:53.054 ], 00:20:53.054 "driver_specific": {} 00:20:53.054 } 00:20:53.054 ] 00:20:53.054 22:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:53.054 22:28:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:53.054 22:28:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:53.054 22:28:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:53.312 BaseBdev4 00:20:53.312 22:28:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev4 00:20:53.312 22:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:53.312 22:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:53.312 22:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:53.312 22:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:53.312 22:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:53.312 22:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:53.570 22:28:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:53.827 [ 00:20:53.827 { 00:20:53.827 "name": "BaseBdev4", 00:20:53.827 "aliases": [ 00:20:53.827 "60d93b99-1a6e-490e-913b-c94d3b8fb810" 00:20:53.827 ], 00:20:53.827 "product_name": "Malloc disk", 00:20:53.827 "block_size": 512, 00:20:53.827 "num_blocks": 65536, 00:20:53.827 "uuid": "60d93b99-1a6e-490e-913b-c94d3b8fb810", 00:20:53.827 "assigned_rate_limits": { 00:20:53.827 "rw_ios_per_sec": 0, 00:20:53.827 "rw_mbytes_per_sec": 0, 00:20:53.827 "r_mbytes_per_sec": 0, 00:20:53.827 "w_mbytes_per_sec": 0 00:20:53.827 }, 00:20:53.827 "claimed": false, 00:20:53.827 "zoned": false, 00:20:53.827 "supported_io_types": { 00:20:53.827 "read": true, 00:20:53.827 "write": true, 00:20:53.827 "unmap": true, 00:20:53.827 "flush": true, 00:20:53.827 "reset": true, 00:20:53.827 "nvme_admin": false, 00:20:53.827 "nvme_io": false, 00:20:53.827 "nvme_io_md": false, 00:20:53.827 "write_zeroes": true, 00:20:53.827 "zcopy": true, 00:20:53.827 "get_zone_info": false, 00:20:53.827 "zone_management": false, 00:20:53.827 "zone_append": false, 00:20:53.827 "compare": false, 00:20:53.827 "compare_and_write": false, 00:20:53.827 "abort": true, 00:20:53.827 "seek_hole": false, 00:20:53.827 "seek_data": false, 00:20:53.827 "copy": true, 00:20:53.827 "nvme_iov_md": false 00:20:53.827 }, 00:20:53.827 "memory_domains": [ 00:20:53.827 { 00:20:53.827 "dma_device_id": "system", 00:20:53.827 "dma_device_type": 1 00:20:53.827 }, 00:20:53.827 { 00:20:53.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.827 "dma_device_type": 2 00:20:53.827 } 00:20:53.827 ], 00:20:53.827 "driver_specific": {} 00:20:53.827 } 00:20:53.827 ] 00:20:53.827 22:28:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:53.827 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:53.827 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:53.827 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:54.085 [2024-07-12 22:28:04.326123] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:54.086 [2024-07-12 22:28:04.326164] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:54.086 [2024-07-12 22:28:04.326185] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:54.086 [2024-07-12 22:28:04.327556] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:54.086 [2024-07-12 22:28:04.327600] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:54.086 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:54.086 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:54.086 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:54.086 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:54.086 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:54.086 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:54.086 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:54.086 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:54.086 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:54.086 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:54.086 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.086 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:54.344 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:54.344 "name": "Existed_Raid", 00:20:54.344 "uuid": "6aa5c7df-23a0-460f-89d5-93b800d50477", 00:20:54.344 "strip_size_kb": 64, 00:20:54.344 "state": "configuring", 00:20:54.344 "raid_level": "concat", 00:20:54.344 "superblock": true, 00:20:54.344 "num_base_bdevs": 4, 00:20:54.344 "num_base_bdevs_discovered": 3, 00:20:54.344 "num_base_bdevs_operational": 4, 00:20:54.344 "base_bdevs_list": [ 00:20:54.344 { 00:20:54.344 "name": "BaseBdev1", 00:20:54.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.344 "is_configured": false, 00:20:54.344 "data_offset": 0, 00:20:54.344 "data_size": 0 00:20:54.344 }, 00:20:54.344 { 00:20:54.344 "name": "BaseBdev2", 00:20:54.344 "uuid": "9cdfba0e-b58c-477d-982a-7fb8589fa35f", 00:20:54.344 "is_configured": true, 00:20:54.344 "data_offset": 2048, 00:20:54.344 "data_size": 63488 00:20:54.344 }, 00:20:54.344 { 00:20:54.344 "name": "BaseBdev3", 00:20:54.344 "uuid": "7f7744e4-c2ab-488c-a222-d437f65462f3", 00:20:54.344 "is_configured": true, 00:20:54.344 "data_offset": 2048, 00:20:54.344 "data_size": 63488 00:20:54.344 }, 00:20:54.344 { 00:20:54.344 "name": "BaseBdev4", 00:20:54.344 "uuid": "60d93b99-1a6e-490e-913b-c94d3b8fb810", 00:20:54.344 "is_configured": true, 00:20:54.344 "data_offset": 2048, 00:20:54.344 "data_size": 63488 00:20:54.344 } 00:20:54.344 ] 00:20:54.344 }' 00:20:54.344 22:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:54.344 22:28:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:54.910 22:28:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:55.169 [2024-07-12 22:28:05.396954] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:55.169 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:55.169 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:55.169 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:55.169 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:55.169 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:55.169 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:55.169 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:55.169 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:55.169 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:55.169 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:55.169 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.169 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:55.427 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.427 "name": "Existed_Raid", 00:20:55.427 "uuid": "6aa5c7df-23a0-460f-89d5-93b800d50477", 00:20:55.427 "strip_size_kb": 64, 00:20:55.427 "state": "configuring", 00:20:55.427 "raid_level": "concat", 00:20:55.427 "superblock": true, 00:20:55.427 "num_base_bdevs": 4, 00:20:55.427 "num_base_bdevs_discovered": 2, 00:20:55.427 "num_base_bdevs_operational": 4, 00:20:55.427 "base_bdevs_list": [ 00:20:55.427 { 00:20:55.427 "name": "BaseBdev1", 00:20:55.427 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.427 "is_configured": false, 00:20:55.427 "data_offset": 0, 00:20:55.427 "data_size": 0 00:20:55.427 }, 00:20:55.427 { 00:20:55.427 "name": null, 00:20:55.427 "uuid": "9cdfba0e-b58c-477d-982a-7fb8589fa35f", 00:20:55.427 "is_configured": false, 00:20:55.427 "data_offset": 2048, 00:20:55.427 "data_size": 63488 00:20:55.427 }, 00:20:55.427 { 00:20:55.427 "name": "BaseBdev3", 00:20:55.427 "uuid": "7f7744e4-c2ab-488c-a222-d437f65462f3", 00:20:55.427 "is_configured": true, 00:20:55.427 "data_offset": 2048, 00:20:55.427 "data_size": 63488 00:20:55.427 }, 00:20:55.427 { 00:20:55.427 "name": "BaseBdev4", 00:20:55.427 "uuid": "60d93b99-1a6e-490e-913b-c94d3b8fb810", 00:20:55.427 "is_configured": true, 00:20:55.427 "data_offset": 2048, 00:20:55.427 "data_size": 63488 00:20:55.427 } 00:20:55.427 ] 00:20:55.427 }' 00:20:55.427 22:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.427 22:28:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:55.994 22:28:06 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.994 22:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:56.252 22:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:56.252 22:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:56.509 [2024-07-12 22:28:06.731983] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:56.509 BaseBdev1 00:20:56.509 22:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:56.509 22:28:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:56.509 22:28:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:56.509 22:28:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:56.510 22:28:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:56.510 22:28:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:56.510 22:28:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:56.767 22:28:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:57.026 [ 00:20:57.026 { 00:20:57.026 "name": "BaseBdev1", 00:20:57.026 "aliases": [ 00:20:57.026 "74b9bcfc-b211-4b7c-8d88-c416e85bfe57" 00:20:57.026 ], 00:20:57.026 "product_name": "Malloc disk", 00:20:57.026 "block_size": 512, 00:20:57.026 "num_blocks": 65536, 00:20:57.026 "uuid": "74b9bcfc-b211-4b7c-8d88-c416e85bfe57", 00:20:57.026 "assigned_rate_limits": { 00:20:57.026 "rw_ios_per_sec": 0, 00:20:57.026 "rw_mbytes_per_sec": 0, 00:20:57.026 "r_mbytes_per_sec": 0, 00:20:57.026 "w_mbytes_per_sec": 0 00:20:57.026 }, 00:20:57.026 "claimed": true, 00:20:57.026 "claim_type": "exclusive_write", 00:20:57.026 "zoned": false, 00:20:57.026 "supported_io_types": { 00:20:57.026 "read": true, 00:20:57.026 "write": true, 00:20:57.026 "unmap": true, 00:20:57.026 "flush": true, 00:20:57.026 "reset": true, 00:20:57.026 "nvme_admin": false, 00:20:57.026 "nvme_io": false, 00:20:57.026 "nvme_io_md": false, 00:20:57.026 "write_zeroes": true, 00:20:57.026 "zcopy": true, 00:20:57.026 "get_zone_info": false, 00:20:57.026 "zone_management": false, 00:20:57.026 "zone_append": false, 00:20:57.026 "compare": false, 00:20:57.026 "compare_and_write": false, 00:20:57.026 "abort": true, 00:20:57.026 "seek_hole": false, 00:20:57.026 "seek_data": false, 00:20:57.026 "copy": true, 00:20:57.026 "nvme_iov_md": false 00:20:57.026 }, 00:20:57.026 "memory_domains": [ 00:20:57.026 { 00:20:57.026 "dma_device_id": "system", 00:20:57.026 "dma_device_type": 1 00:20:57.026 }, 00:20:57.026 { 00:20:57.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:57.026 "dma_device_type": 2 00:20:57.026 } 00:20:57.026 ], 00:20:57.026 "driver_specific": {} 00:20:57.026 } 00:20:57.026 ] 00:20:57.026 
22:28:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:57.026 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:57.026 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:57.026 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:57.026 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:57.026 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:57.026 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:57.026 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.026 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.026 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.026 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.026 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.026 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:57.285 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.285 "name": "Existed_Raid", 00:20:57.285 "uuid": "6aa5c7df-23a0-460f-89d5-93b800d50477", 00:20:57.285 "strip_size_kb": 64, 00:20:57.285 "state": "configuring", 00:20:57.285 "raid_level": "concat", 00:20:57.285 "superblock": true, 00:20:57.285 "num_base_bdevs": 4, 00:20:57.285 "num_base_bdevs_discovered": 3, 00:20:57.285 "num_base_bdevs_operational": 4, 00:20:57.285 "base_bdevs_list": [ 00:20:57.285 { 00:20:57.285 "name": "BaseBdev1", 00:20:57.285 "uuid": "74b9bcfc-b211-4b7c-8d88-c416e85bfe57", 00:20:57.285 "is_configured": true, 00:20:57.285 "data_offset": 2048, 00:20:57.285 "data_size": 63488 00:20:57.285 }, 00:20:57.285 { 00:20:57.285 "name": null, 00:20:57.285 "uuid": "9cdfba0e-b58c-477d-982a-7fb8589fa35f", 00:20:57.285 "is_configured": false, 00:20:57.285 "data_offset": 2048, 00:20:57.285 "data_size": 63488 00:20:57.285 }, 00:20:57.285 { 00:20:57.285 "name": "BaseBdev3", 00:20:57.285 "uuid": "7f7744e4-c2ab-488c-a222-d437f65462f3", 00:20:57.285 "is_configured": true, 00:20:57.285 "data_offset": 2048, 00:20:57.285 "data_size": 63488 00:20:57.285 }, 00:20:57.285 { 00:20:57.285 "name": "BaseBdev4", 00:20:57.285 "uuid": "60d93b99-1a6e-490e-913b-c94d3b8fb810", 00:20:57.285 "is_configured": true, 00:20:57.285 "data_offset": 2048, 00:20:57.285 "data_size": 63488 00:20:57.285 } 00:20:57.285 ] 00:20:57.285 }' 00:20:57.285 22:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.285 22:28:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:57.851 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.851 22:28:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:58.109 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:58.110 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:58.368 [2024-07-12 22:28:08.448549] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:58.368 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:58.368 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:58.368 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:58.368 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:58.368 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:58.368 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:58.368 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:58.368 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:58.368 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:58.368 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:58.368 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.368 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:58.626 22:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:58.626 "name": "Existed_Raid", 00:20:58.626 "uuid": "6aa5c7df-23a0-460f-89d5-93b800d50477", 00:20:58.626 "strip_size_kb": 64, 00:20:58.626 "state": "configuring", 00:20:58.626 "raid_level": "concat", 00:20:58.626 "superblock": true, 00:20:58.626 "num_base_bdevs": 4, 00:20:58.626 "num_base_bdevs_discovered": 2, 00:20:58.626 "num_base_bdevs_operational": 4, 00:20:58.626 "base_bdevs_list": [ 00:20:58.626 { 00:20:58.626 "name": "BaseBdev1", 00:20:58.626 "uuid": "74b9bcfc-b211-4b7c-8d88-c416e85bfe57", 00:20:58.626 "is_configured": true, 00:20:58.626 "data_offset": 2048, 00:20:58.626 "data_size": 63488 00:20:58.626 }, 00:20:58.626 { 00:20:58.626 "name": null, 00:20:58.626 "uuid": "9cdfba0e-b58c-477d-982a-7fb8589fa35f", 00:20:58.626 "is_configured": false, 00:20:58.626 "data_offset": 2048, 00:20:58.626 "data_size": 63488 00:20:58.626 }, 00:20:58.626 { 00:20:58.626 "name": null, 00:20:58.626 "uuid": "7f7744e4-c2ab-488c-a222-d437f65462f3", 00:20:58.626 "is_configured": false, 00:20:58.626 "data_offset": 2048, 00:20:58.626 "data_size": 63488 00:20:58.626 }, 00:20:58.626 { 00:20:58.626 "name": "BaseBdev4", 00:20:58.626 "uuid": "60d93b99-1a6e-490e-913b-c94d3b8fb810", 00:20:58.626 "is_configured": true, 00:20:58.626 "data_offset": 2048, 00:20:58.626 "data_size": 63488 00:20:58.626 } 00:20:58.626 ] 00:20:58.626 }' 00:20:58.626 22:28:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:58.626 22:28:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:59.192 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.192 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:59.451 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:59.451 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:59.451 [2024-07-12 22:28:09.776105] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:59.709 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:59.709 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:59.709 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:59.709 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:59.709 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:59.709 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:59.709 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:59.709 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:59.709 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:59.709 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:59.709 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.709 22:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:59.967 22:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.967 "name": "Existed_Raid", 00:20:59.967 "uuid": "6aa5c7df-23a0-460f-89d5-93b800d50477", 00:20:59.967 "strip_size_kb": 64, 00:20:59.967 "state": "configuring", 00:20:59.967 "raid_level": "concat", 00:20:59.967 "superblock": true, 00:20:59.967 "num_base_bdevs": 4, 00:20:59.967 "num_base_bdevs_discovered": 3, 00:20:59.967 "num_base_bdevs_operational": 4, 00:20:59.968 "base_bdevs_list": [ 00:20:59.968 { 00:20:59.968 "name": "BaseBdev1", 00:20:59.968 "uuid": "74b9bcfc-b211-4b7c-8d88-c416e85bfe57", 00:20:59.968 "is_configured": true, 00:20:59.968 "data_offset": 2048, 00:20:59.968 "data_size": 63488 00:20:59.968 }, 00:20:59.968 { 00:20:59.968 "name": null, 00:20:59.968 "uuid": "9cdfba0e-b58c-477d-982a-7fb8589fa35f", 00:20:59.968 "is_configured": false, 00:20:59.968 "data_offset": 2048, 00:20:59.968 "data_size": 63488 00:20:59.968 }, 00:20:59.968 { 00:20:59.968 "name": "BaseBdev3", 00:20:59.968 "uuid": "7f7744e4-c2ab-488c-a222-d437f65462f3", 
00:20:59.968 "is_configured": true, 00:20:59.968 "data_offset": 2048, 00:20:59.968 "data_size": 63488 00:20:59.968 }, 00:20:59.968 { 00:20:59.968 "name": "BaseBdev4", 00:20:59.968 "uuid": "60d93b99-1a6e-490e-913b-c94d3b8fb810", 00:20:59.968 "is_configured": true, 00:20:59.968 "data_offset": 2048, 00:20:59.968 "data_size": 63488 00:20:59.968 } 00:20:59.968 ] 00:20:59.968 }' 00:20:59.968 22:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.968 22:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:00.534 22:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:00.534 22:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.792 22:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:00.792 22:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:00.792 [2024-07-12 22:28:11.091626] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:01.051 "name": "Existed_Raid", 00:21:01.051 "uuid": "6aa5c7df-23a0-460f-89d5-93b800d50477", 00:21:01.051 "strip_size_kb": 64, 00:21:01.051 "state": "configuring", 00:21:01.051 "raid_level": "concat", 00:21:01.051 "superblock": true, 00:21:01.051 "num_base_bdevs": 4, 00:21:01.051 "num_base_bdevs_discovered": 2, 00:21:01.051 "num_base_bdevs_operational": 4, 00:21:01.051 "base_bdevs_list": [ 00:21:01.051 { 00:21:01.051 "name": null, 00:21:01.051 "uuid": "74b9bcfc-b211-4b7c-8d88-c416e85bfe57", 00:21:01.051 "is_configured": false, 00:21:01.051 "data_offset": 
2048, 00:21:01.051 "data_size": 63488 00:21:01.051 }, 00:21:01.051 { 00:21:01.051 "name": null, 00:21:01.051 "uuid": "9cdfba0e-b58c-477d-982a-7fb8589fa35f", 00:21:01.051 "is_configured": false, 00:21:01.051 "data_offset": 2048, 00:21:01.051 "data_size": 63488 00:21:01.051 }, 00:21:01.051 { 00:21:01.051 "name": "BaseBdev3", 00:21:01.051 "uuid": "7f7744e4-c2ab-488c-a222-d437f65462f3", 00:21:01.051 "is_configured": true, 00:21:01.051 "data_offset": 2048, 00:21:01.051 "data_size": 63488 00:21:01.051 }, 00:21:01.051 { 00:21:01.051 "name": "BaseBdev4", 00:21:01.051 "uuid": "60d93b99-1a6e-490e-913b-c94d3b8fb810", 00:21:01.051 "is_configured": true, 00:21:01.051 "data_offset": 2048, 00:21:01.051 "data_size": 63488 00:21:01.051 } 00:21:01.051 ] 00:21:01.051 }' 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:01.051 22:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:01.983 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.983 22:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:01.983 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:01.983 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:02.240 [2024-07-12 22:28:12.447707] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:02.240 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:02.240 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:02.240 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:02.240 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:02.240 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:02.240 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:02.240 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:02.240 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:02.240 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:02.240 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:02.240 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.240 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:02.499 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:02.499 "name": "Existed_Raid", 00:21:02.499 "uuid": "6aa5c7df-23a0-460f-89d5-93b800d50477", 00:21:02.499 "strip_size_kb": 64, 
00:21:02.499 "state": "configuring", 00:21:02.499 "raid_level": "concat", 00:21:02.499 "superblock": true, 00:21:02.499 "num_base_bdevs": 4, 00:21:02.499 "num_base_bdevs_discovered": 3, 00:21:02.499 "num_base_bdevs_operational": 4, 00:21:02.499 "base_bdevs_list": [ 00:21:02.499 { 00:21:02.499 "name": null, 00:21:02.499 "uuid": "74b9bcfc-b211-4b7c-8d88-c416e85bfe57", 00:21:02.499 "is_configured": false, 00:21:02.499 "data_offset": 2048, 00:21:02.499 "data_size": 63488 00:21:02.499 }, 00:21:02.499 { 00:21:02.499 "name": "BaseBdev2", 00:21:02.499 "uuid": "9cdfba0e-b58c-477d-982a-7fb8589fa35f", 00:21:02.499 "is_configured": true, 00:21:02.499 "data_offset": 2048, 00:21:02.499 "data_size": 63488 00:21:02.499 }, 00:21:02.499 { 00:21:02.499 "name": "BaseBdev3", 00:21:02.499 "uuid": "7f7744e4-c2ab-488c-a222-d437f65462f3", 00:21:02.499 "is_configured": true, 00:21:02.499 "data_offset": 2048, 00:21:02.499 "data_size": 63488 00:21:02.499 }, 00:21:02.499 { 00:21:02.499 "name": "BaseBdev4", 00:21:02.499 "uuid": "60d93b99-1a6e-490e-913b-c94d3b8fb810", 00:21:02.499 "is_configured": true, 00:21:02.499 "data_offset": 2048, 00:21:02.499 "data_size": 63488 00:21:02.499 } 00:21:02.499 ] 00:21:02.499 }' 00:21:02.499 22:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:02.499 22:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:03.061 22:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.061 22:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:03.318 22:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:03.318 22:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.318 22:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:03.576 22:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 74b9bcfc-b211-4b7c-8d88-c416e85bfe57 00:21:03.834 [2024-07-12 22:28:14.023262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:03.834 [2024-07-12 22:28:14.023422] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1956850 00:21:03.834 [2024-07-12 22:28:14.023435] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:03.834 [2024-07-12 22:28:14.023613] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x194cd80 00:21:03.834 [2024-07-12 22:28:14.023727] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1956850 00:21:03.834 [2024-07-12 22:28:14.023737] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1956850 00:21:03.834 [2024-07-12 22:28:14.023827] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:03.834 NewBaseBdev 00:21:03.834 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:03.834 22:28:14 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:03.834 22:28:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:03.834 22:28:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:03.834 22:28:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:03.834 22:28:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:03.834 22:28:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:04.092 22:28:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:04.350 [ 00:21:04.350 { 00:21:04.350 "name": "NewBaseBdev", 00:21:04.350 "aliases": [ 00:21:04.350 "74b9bcfc-b211-4b7c-8d88-c416e85bfe57" 00:21:04.350 ], 00:21:04.350 "product_name": "Malloc disk", 00:21:04.350 "block_size": 512, 00:21:04.350 "num_blocks": 65536, 00:21:04.350 "uuid": "74b9bcfc-b211-4b7c-8d88-c416e85bfe57", 00:21:04.350 "assigned_rate_limits": { 00:21:04.350 "rw_ios_per_sec": 0, 00:21:04.350 "rw_mbytes_per_sec": 0, 00:21:04.350 "r_mbytes_per_sec": 0, 00:21:04.350 "w_mbytes_per_sec": 0 00:21:04.350 }, 00:21:04.350 "claimed": true, 00:21:04.350 "claim_type": "exclusive_write", 00:21:04.350 "zoned": false, 00:21:04.350 "supported_io_types": { 00:21:04.350 "read": true, 00:21:04.350 "write": true, 00:21:04.350 "unmap": true, 00:21:04.350 "flush": true, 00:21:04.350 "reset": true, 00:21:04.350 "nvme_admin": false, 00:21:04.350 "nvme_io": false, 00:21:04.350 "nvme_io_md": false, 00:21:04.350 "write_zeroes": true, 00:21:04.350 "zcopy": true, 00:21:04.350 "get_zone_info": false, 00:21:04.350 "zone_management": false, 00:21:04.350 "zone_append": false, 00:21:04.350 "compare": false, 00:21:04.350 "compare_and_write": false, 00:21:04.351 "abort": true, 00:21:04.351 "seek_hole": false, 00:21:04.351 "seek_data": false, 00:21:04.351 "copy": true, 00:21:04.351 "nvme_iov_md": false 00:21:04.351 }, 00:21:04.351 "memory_domains": [ 00:21:04.351 { 00:21:04.351 "dma_device_id": "system", 00:21:04.351 "dma_device_type": 1 00:21:04.351 }, 00:21:04.351 { 00:21:04.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:04.351 "dma_device_type": 2 00:21:04.351 } 00:21:04.351 ], 00:21:04.351 "driver_specific": {} 00:21:04.351 } 00:21:04.351 ] 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.351 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:04.610 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:04.610 "name": "Existed_Raid", 00:21:04.610 "uuid": "6aa5c7df-23a0-460f-89d5-93b800d50477", 00:21:04.610 "strip_size_kb": 64, 00:21:04.610 "state": "online", 00:21:04.610 "raid_level": "concat", 00:21:04.610 "superblock": true, 00:21:04.610 "num_base_bdevs": 4, 00:21:04.610 "num_base_bdevs_discovered": 4, 00:21:04.610 "num_base_bdevs_operational": 4, 00:21:04.610 "base_bdevs_list": [ 00:21:04.610 { 00:21:04.610 "name": "NewBaseBdev", 00:21:04.610 "uuid": "74b9bcfc-b211-4b7c-8d88-c416e85bfe57", 00:21:04.610 "is_configured": true, 00:21:04.610 "data_offset": 2048, 00:21:04.610 "data_size": 63488 00:21:04.610 }, 00:21:04.610 { 00:21:04.610 "name": "BaseBdev2", 00:21:04.610 "uuid": "9cdfba0e-b58c-477d-982a-7fb8589fa35f", 00:21:04.610 "is_configured": true, 00:21:04.610 "data_offset": 2048, 00:21:04.610 "data_size": 63488 00:21:04.610 }, 00:21:04.610 { 00:21:04.610 "name": "BaseBdev3", 00:21:04.610 "uuid": "7f7744e4-c2ab-488c-a222-d437f65462f3", 00:21:04.610 "is_configured": true, 00:21:04.610 "data_offset": 2048, 00:21:04.610 "data_size": 63488 00:21:04.610 }, 00:21:04.610 { 00:21:04.610 "name": "BaseBdev4", 00:21:04.610 "uuid": "60d93b99-1a6e-490e-913b-c94d3b8fb810", 00:21:04.610 "is_configured": true, 00:21:04.610 "data_offset": 2048, 00:21:04.610 "data_size": 63488 00:21:04.610 } 00:21:04.610 ] 00:21:04.610 }' 00:21:04.610 22:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:04.610 22:28:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:05.176 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:05.176 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:05.176 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:05.176 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:05.176 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:05.176 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:05.176 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:05.176 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:05.434 [2024-07-12 22:28:15.543634] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:05.434 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:21:05.434 "name": "Existed_Raid", 00:21:05.434 "aliases": [ 00:21:05.434 "6aa5c7df-23a0-460f-89d5-93b800d50477" 00:21:05.434 ], 00:21:05.434 "product_name": "Raid Volume", 00:21:05.434 "block_size": 512, 00:21:05.434 "num_blocks": 253952, 00:21:05.434 "uuid": "6aa5c7df-23a0-460f-89d5-93b800d50477", 00:21:05.434 "assigned_rate_limits": { 00:21:05.434 "rw_ios_per_sec": 0, 00:21:05.434 "rw_mbytes_per_sec": 0, 00:21:05.434 "r_mbytes_per_sec": 0, 00:21:05.434 "w_mbytes_per_sec": 0 00:21:05.434 }, 00:21:05.434 "claimed": false, 00:21:05.434 "zoned": false, 00:21:05.434 "supported_io_types": { 00:21:05.434 "read": true, 00:21:05.434 "write": true, 00:21:05.434 "unmap": true, 00:21:05.434 "flush": true, 00:21:05.434 "reset": true, 00:21:05.434 "nvme_admin": false, 00:21:05.434 "nvme_io": false, 00:21:05.434 "nvme_io_md": false, 00:21:05.434 "write_zeroes": true, 00:21:05.434 "zcopy": false, 00:21:05.434 "get_zone_info": false, 00:21:05.434 "zone_management": false, 00:21:05.434 "zone_append": false, 00:21:05.434 "compare": false, 00:21:05.434 "compare_and_write": false, 00:21:05.434 "abort": false, 00:21:05.434 "seek_hole": false, 00:21:05.434 "seek_data": false, 00:21:05.434 "copy": false, 00:21:05.434 "nvme_iov_md": false 00:21:05.434 }, 00:21:05.435 "memory_domains": [ 00:21:05.435 { 00:21:05.435 "dma_device_id": "system", 00:21:05.435 "dma_device_type": 1 00:21:05.435 }, 00:21:05.435 { 00:21:05.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.435 "dma_device_type": 2 00:21:05.435 }, 00:21:05.435 { 00:21:05.435 "dma_device_id": "system", 00:21:05.435 "dma_device_type": 1 00:21:05.435 }, 00:21:05.435 { 00:21:05.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.435 "dma_device_type": 2 00:21:05.435 }, 00:21:05.435 { 00:21:05.435 "dma_device_id": "system", 00:21:05.435 "dma_device_type": 1 00:21:05.435 }, 00:21:05.435 { 00:21:05.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.435 "dma_device_type": 2 00:21:05.435 }, 00:21:05.435 { 00:21:05.435 "dma_device_id": "system", 00:21:05.435 "dma_device_type": 1 00:21:05.435 }, 00:21:05.435 { 00:21:05.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.435 "dma_device_type": 2 00:21:05.435 } 00:21:05.435 ], 00:21:05.435 "driver_specific": { 00:21:05.435 "raid": { 00:21:05.435 "uuid": "6aa5c7df-23a0-460f-89d5-93b800d50477", 00:21:05.435 "strip_size_kb": 64, 00:21:05.435 "state": "online", 00:21:05.435 "raid_level": "concat", 00:21:05.435 "superblock": true, 00:21:05.435 "num_base_bdevs": 4, 00:21:05.435 "num_base_bdevs_discovered": 4, 00:21:05.435 "num_base_bdevs_operational": 4, 00:21:05.435 "base_bdevs_list": [ 00:21:05.435 { 00:21:05.435 "name": "NewBaseBdev", 00:21:05.435 "uuid": "74b9bcfc-b211-4b7c-8d88-c416e85bfe57", 00:21:05.435 "is_configured": true, 00:21:05.435 "data_offset": 2048, 00:21:05.435 "data_size": 63488 00:21:05.435 }, 00:21:05.435 { 00:21:05.435 "name": "BaseBdev2", 00:21:05.435 "uuid": "9cdfba0e-b58c-477d-982a-7fb8589fa35f", 00:21:05.435 "is_configured": true, 00:21:05.435 "data_offset": 2048, 00:21:05.435 "data_size": 63488 00:21:05.435 }, 00:21:05.435 { 00:21:05.435 "name": "BaseBdev3", 00:21:05.435 "uuid": "7f7744e4-c2ab-488c-a222-d437f65462f3", 00:21:05.435 "is_configured": true, 00:21:05.435 "data_offset": 2048, 00:21:05.435 "data_size": 63488 00:21:05.435 }, 00:21:05.435 { 00:21:05.435 "name": "BaseBdev4", 00:21:05.435 "uuid": "60d93b99-1a6e-490e-913b-c94d3b8fb810", 00:21:05.435 "is_configured": true, 00:21:05.435 "data_offset": 2048, 00:21:05.435 "data_size": 63488 00:21:05.435 } 
00:21:05.435 ] 00:21:05.435 } 00:21:05.435 } 00:21:05.435 }' 00:21:05.435 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:05.435 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:05.435 BaseBdev2 00:21:05.435 BaseBdev3 00:21:05.435 BaseBdev4' 00:21:05.435 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:05.435 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:05.435 22:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:06.000 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:06.000 "name": "NewBaseBdev", 00:21:06.000 "aliases": [ 00:21:06.000 "74b9bcfc-b211-4b7c-8d88-c416e85bfe57" 00:21:06.000 ], 00:21:06.000 "product_name": "Malloc disk", 00:21:06.000 "block_size": 512, 00:21:06.000 "num_blocks": 65536, 00:21:06.000 "uuid": "74b9bcfc-b211-4b7c-8d88-c416e85bfe57", 00:21:06.000 "assigned_rate_limits": { 00:21:06.000 "rw_ios_per_sec": 0, 00:21:06.000 "rw_mbytes_per_sec": 0, 00:21:06.000 "r_mbytes_per_sec": 0, 00:21:06.000 "w_mbytes_per_sec": 0 00:21:06.000 }, 00:21:06.000 "claimed": true, 00:21:06.000 "claim_type": "exclusive_write", 00:21:06.000 "zoned": false, 00:21:06.000 "supported_io_types": { 00:21:06.000 "read": true, 00:21:06.000 "write": true, 00:21:06.000 "unmap": true, 00:21:06.000 "flush": true, 00:21:06.000 "reset": true, 00:21:06.000 "nvme_admin": false, 00:21:06.000 "nvme_io": false, 00:21:06.000 "nvme_io_md": false, 00:21:06.000 "write_zeroes": true, 00:21:06.000 "zcopy": true, 00:21:06.000 "get_zone_info": false, 00:21:06.000 "zone_management": false, 00:21:06.000 "zone_append": false, 00:21:06.000 "compare": false, 00:21:06.000 "compare_and_write": false, 00:21:06.000 "abort": true, 00:21:06.000 "seek_hole": false, 00:21:06.001 "seek_data": false, 00:21:06.001 "copy": true, 00:21:06.001 "nvme_iov_md": false 00:21:06.001 }, 00:21:06.001 "memory_domains": [ 00:21:06.001 { 00:21:06.001 "dma_device_id": "system", 00:21:06.001 "dma_device_type": 1 00:21:06.001 }, 00:21:06.001 { 00:21:06.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.001 "dma_device_type": 2 00:21:06.001 } 00:21:06.001 ], 00:21:06.001 "driver_specific": {} 00:21:06.001 }' 00:21:06.001 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.001 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.001 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:06.001 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.001 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.001 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:06.001 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.258 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.258 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:06.258 
22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:06.258 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:06.258 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:06.258 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:06.258 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:06.258 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:06.516 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:06.516 "name": "BaseBdev2", 00:21:06.516 "aliases": [ 00:21:06.516 "9cdfba0e-b58c-477d-982a-7fb8589fa35f" 00:21:06.516 ], 00:21:06.516 "product_name": "Malloc disk", 00:21:06.516 "block_size": 512, 00:21:06.516 "num_blocks": 65536, 00:21:06.516 "uuid": "9cdfba0e-b58c-477d-982a-7fb8589fa35f", 00:21:06.516 "assigned_rate_limits": { 00:21:06.516 "rw_ios_per_sec": 0, 00:21:06.516 "rw_mbytes_per_sec": 0, 00:21:06.516 "r_mbytes_per_sec": 0, 00:21:06.516 "w_mbytes_per_sec": 0 00:21:06.516 }, 00:21:06.516 "claimed": true, 00:21:06.516 "claim_type": "exclusive_write", 00:21:06.516 "zoned": false, 00:21:06.516 "supported_io_types": { 00:21:06.516 "read": true, 00:21:06.516 "write": true, 00:21:06.516 "unmap": true, 00:21:06.516 "flush": true, 00:21:06.516 "reset": true, 00:21:06.516 "nvme_admin": false, 00:21:06.516 "nvme_io": false, 00:21:06.516 "nvme_io_md": false, 00:21:06.516 "write_zeroes": true, 00:21:06.516 "zcopy": true, 00:21:06.516 "get_zone_info": false, 00:21:06.516 "zone_management": false, 00:21:06.516 "zone_append": false, 00:21:06.516 "compare": false, 00:21:06.516 "compare_and_write": false, 00:21:06.516 "abort": true, 00:21:06.516 "seek_hole": false, 00:21:06.516 "seek_data": false, 00:21:06.516 "copy": true, 00:21:06.516 "nvme_iov_md": false 00:21:06.516 }, 00:21:06.516 "memory_domains": [ 00:21:06.516 { 00:21:06.516 "dma_device_id": "system", 00:21:06.516 "dma_device_type": 1 00:21:06.516 }, 00:21:06.516 { 00:21:06.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.516 "dma_device_type": 2 00:21:06.516 } 00:21:06.516 ], 00:21:06.516 "driver_specific": {} 00:21:06.516 }' 00:21:06.516 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.516 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.516 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:06.516 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.774 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.774 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:06.774 22:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.774 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.774 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:06.774 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:07.031 22:28:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:07.031 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:07.031 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:07.031 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:07.031 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:07.289 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:07.289 "name": "BaseBdev3", 00:21:07.289 "aliases": [ 00:21:07.289 "7f7744e4-c2ab-488c-a222-d437f65462f3" 00:21:07.289 ], 00:21:07.289 "product_name": "Malloc disk", 00:21:07.289 "block_size": 512, 00:21:07.289 "num_blocks": 65536, 00:21:07.289 "uuid": "7f7744e4-c2ab-488c-a222-d437f65462f3", 00:21:07.289 "assigned_rate_limits": { 00:21:07.289 "rw_ios_per_sec": 0, 00:21:07.289 "rw_mbytes_per_sec": 0, 00:21:07.289 "r_mbytes_per_sec": 0, 00:21:07.289 "w_mbytes_per_sec": 0 00:21:07.289 }, 00:21:07.289 "claimed": true, 00:21:07.289 "claim_type": "exclusive_write", 00:21:07.289 "zoned": false, 00:21:07.289 "supported_io_types": { 00:21:07.289 "read": true, 00:21:07.289 "write": true, 00:21:07.289 "unmap": true, 00:21:07.289 "flush": true, 00:21:07.289 "reset": true, 00:21:07.289 "nvme_admin": false, 00:21:07.289 "nvme_io": false, 00:21:07.289 "nvme_io_md": false, 00:21:07.289 "write_zeroes": true, 00:21:07.289 "zcopy": true, 00:21:07.289 "get_zone_info": false, 00:21:07.289 "zone_management": false, 00:21:07.289 "zone_append": false, 00:21:07.289 "compare": false, 00:21:07.289 "compare_and_write": false, 00:21:07.289 "abort": true, 00:21:07.289 "seek_hole": false, 00:21:07.289 "seek_data": false, 00:21:07.290 "copy": true, 00:21:07.290 "nvme_iov_md": false 00:21:07.290 }, 00:21:07.290 "memory_domains": [ 00:21:07.290 { 00:21:07.290 "dma_device_id": "system", 00:21:07.290 "dma_device_type": 1 00:21:07.290 }, 00:21:07.290 { 00:21:07.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.290 "dma_device_type": 2 00:21:07.290 } 00:21:07.290 ], 00:21:07.290 "driver_specific": {} 00:21:07.290 }' 00:21:07.290 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.290 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.290 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:07.290 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.290 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.290 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:07.290 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:07.290 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:07.548 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:07.548 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:07.548 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:07.548 22:28:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:07.548 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:07.548 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:07.548 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:07.805 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:07.805 "name": "BaseBdev4", 00:21:07.805 "aliases": [ 00:21:07.805 "60d93b99-1a6e-490e-913b-c94d3b8fb810" 00:21:07.805 ], 00:21:07.805 "product_name": "Malloc disk", 00:21:07.805 "block_size": 512, 00:21:07.805 "num_blocks": 65536, 00:21:07.805 "uuid": "60d93b99-1a6e-490e-913b-c94d3b8fb810", 00:21:07.805 "assigned_rate_limits": { 00:21:07.805 "rw_ios_per_sec": 0, 00:21:07.805 "rw_mbytes_per_sec": 0, 00:21:07.805 "r_mbytes_per_sec": 0, 00:21:07.805 "w_mbytes_per_sec": 0 00:21:07.805 }, 00:21:07.805 "claimed": true, 00:21:07.805 "claim_type": "exclusive_write", 00:21:07.805 "zoned": false, 00:21:07.805 "supported_io_types": { 00:21:07.805 "read": true, 00:21:07.805 "write": true, 00:21:07.805 "unmap": true, 00:21:07.805 "flush": true, 00:21:07.805 "reset": true, 00:21:07.805 "nvme_admin": false, 00:21:07.805 "nvme_io": false, 00:21:07.805 "nvme_io_md": false, 00:21:07.805 "write_zeroes": true, 00:21:07.805 "zcopy": true, 00:21:07.805 "get_zone_info": false, 00:21:07.805 "zone_management": false, 00:21:07.805 "zone_append": false, 00:21:07.805 "compare": false, 00:21:07.805 "compare_and_write": false, 00:21:07.805 "abort": true, 00:21:07.805 "seek_hole": false, 00:21:07.805 "seek_data": false, 00:21:07.805 "copy": true, 00:21:07.805 "nvme_iov_md": false 00:21:07.805 }, 00:21:07.805 "memory_domains": [ 00:21:07.805 { 00:21:07.805 "dma_device_id": "system", 00:21:07.805 "dma_device_type": 1 00:21:07.805 }, 00:21:07.805 { 00:21:07.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.805 "dma_device_type": 2 00:21:07.805 } 00:21:07.805 ], 00:21:07.805 "driver_specific": {} 00:21:07.805 }' 00:21:07.805 22:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.805 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.805 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:07.805 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.805 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:08.062 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:08.062 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:08.062 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:08.062 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:08.062 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:08.062 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:08.062 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:08.062 22:28:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:08.320 [2024-07-12 22:28:18.539287] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:08.321 [2024-07-12 22:28:18.539316] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:08.321 [2024-07-12 22:28:18.539378] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:08.321 [2024-07-12 22:28:18.539444] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:08.321 [2024-07-12 22:28:18.539463] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1956850 name Existed_Raid, state offline 00:21:08.321 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3501719 00:21:08.321 22:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3501719 ']' 00:21:08.321 22:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3501719 00:21:08.321 22:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:08.321 22:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:08.321 22:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3501719 00:21:08.321 22:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:08.321 22:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:08.321 22:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3501719' 00:21:08.321 killing process with pid 3501719 00:21:08.321 22:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3501719 00:21:08.321 [2024-07-12 22:28:18.605485] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:08.321 22:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3501719 00:21:08.597 [2024-07-12 22:28:18.648356] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:08.597 22:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:08.597 00:21:08.597 real 0m32.903s 00:21:08.597 user 1m0.336s 00:21:08.597 sys 0m5.899s 00:21:08.597 22:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:08.597 22:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:08.597 ************************************ 00:21:08.597 END TEST raid_state_function_test_sb 00:21:08.597 ************************************ 00:21:08.868 22:28:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:08.868 22:28:18 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:21:08.868 22:28:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:08.868 22:28:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:08.868 22:28:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:08.868 ************************************ 00:21:08.868 START TEST raid_superblock_test 00:21:08.868 
************************************ 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3506608 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3506608 /var/tmp/spdk-raid.sock 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3506608 ']' 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:08.868 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:08.868 22:28:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:08.868 [2024-07-12 22:28:19.010843] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
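At this point the harness has launched SPDK's bdev_svc application on a dedicated RPC socket (-r /var/tmp/spdk-raid.sock) with raid debug logging enabled (-L bdev_raid) and waits for that socket before issuing any bdev RPCs. The trace that follows builds the array under test for raid_superblock_test: four 32 MiB malloc bdevs, each wrapped in a passthru bdev with a fixed UUID, assembled into a concat raid with an on-disk superblock. A minimal standalone sketch of that sequence, using only the commands and arguments visible in this trace (the socket-poll loop stands in for the harness's waitforlisten helper from common/autotest_common.sh):

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # start the bdev service app on its own RPC socket with bdev_raid debug logging
  $SPDK/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid &
  raid_pid=$!   # the harness later stops this pid via killprocess

  # wait until the RPC socket answers (approximation of waitforlisten)
  until $RPC rpc_get_methods >/dev/null 2>&1; do sleep 0.5; done

  # four 32 MiB malloc bdevs (512-byte blocks -> 65536 blocks, as in the dumps above),
  # each claimed by a passthru bdev with a fixed UUID so the superblock records stable ids
  for i in 1 2 3 4; do
      $RPC bdev_malloc_create 32 512 -b malloc$i
      $RPC bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
  done

  # assemble the concat array with a 64 KiB strip size and a superblock (-s)
  $RPC bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'   # expect: online
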
00:21:08.869 [2024-07-12 22:28:19.010912] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3506608 ] 00:21:08.869 [2024-07-12 22:28:19.141559] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:09.126 [2024-07-12 22:28:19.248226] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:09.126 [2024-07-12 22:28:19.319707] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:09.126 [2024-07-12 22:28:19.319746] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:09.691 22:28:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:09.691 22:28:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:21:09.691 22:28:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:09.691 22:28:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:09.691 22:28:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:09.691 22:28:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:09.691 22:28:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:09.691 22:28:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:09.691 22:28:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:09.691 22:28:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:09.691 22:28:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:09.948 malloc1 00:21:09.948 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:10.205 [2024-07-12 22:28:20.354247] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:10.205 [2024-07-12 22:28:20.354297] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:10.205 [2024-07-12 22:28:20.354320] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1300570 00:21:10.205 [2024-07-12 22:28:20.354332] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:10.205 [2024-07-12 22:28:20.356186] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:10.205 [2024-07-12 22:28:20.356218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:10.205 pt1 00:21:10.205 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:10.205 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:10.205 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:10.205 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:10.205 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:10.205 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:10.205 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:10.205 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:10.205 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:10.463 malloc2 00:21:10.463 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:10.721 [2024-07-12 22:28:20.829650] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:10.721 [2024-07-12 22:28:20.829698] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:10.721 [2024-07-12 22:28:20.829717] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1301970 00:21:10.721 [2024-07-12 22:28:20.829729] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:10.721 [2024-07-12 22:28:20.831360] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:10.721 [2024-07-12 22:28:20.831389] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:10.721 pt2 00:21:10.721 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:10.721 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:10.721 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:21:10.721 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:21:10.721 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:10.721 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:10.721 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:10.721 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:10.721 22:28:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:10.721 malloc3 00:21:10.721 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:11.003 [2024-07-12 22:28:21.236780] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:11.003 [2024-07-12 22:28:21.236828] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:11.003 [2024-07-12 22:28:21.236846] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1498340 00:21:11.003 [2024-07-12 22:28:21.236859] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:11.003 [2024-07-12 22:28:21.238431] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:11.003 [2024-07-12 22:28:21.238460] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:11.003 pt3 00:21:11.003 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:11.003 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:11.003 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:21:11.003 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:21:11.003 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:11.003 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:11.003 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:11.003 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:11.003 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:11.260 malloc4 00:21:11.260 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:11.517 [2024-07-12 22:28:21.642450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:11.517 [2024-07-12 22:28:21.642497] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:11.517 [2024-07-12 22:28:21.642520] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x149ac60 00:21:11.517 [2024-07-12 22:28:21.642532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:11.517 [2024-07-12 22:28:21.644075] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:11.517 [2024-07-12 22:28:21.644104] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:11.517 pt4 00:21:11.517 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:11.517 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:11.517 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:11.775 [2024-07-12 22:28:21.871079] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:11.775 [2024-07-12 22:28:21.872404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:11.775 [2024-07-12 22:28:21.872459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:11.775 [2024-07-12 22:28:21.872503] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:11.775 [2024-07-12 22:28:21.872671] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12f8530 00:21:11.775 [2024-07-12 22:28:21.872682] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:11.775 [2024-07-12 22:28:21.872878] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12f6770 00:21:11.775 [2024-07-12 22:28:21.873033] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12f8530 00:21:11.775 [2024-07-12 22:28:21.873044] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12f8530 00:21:11.775 [2024-07-12 22:28:21.873140] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:11.775 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:11.775 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:11.775 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:11.775 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:11.775 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:11.775 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:11.775 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:11.775 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:11.775 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:11.775 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:11.775 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.775 22:28:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:11.775 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:11.775 "name": "raid_bdev1", 00:21:11.775 "uuid": "920338fe-02a1-4e9c-b149-c43f200816a6", 00:21:11.775 "strip_size_kb": 64, 00:21:11.775 "state": "online", 00:21:11.775 "raid_level": "concat", 00:21:11.775 "superblock": true, 00:21:11.775 "num_base_bdevs": 4, 00:21:11.775 "num_base_bdevs_discovered": 4, 00:21:11.775 "num_base_bdevs_operational": 4, 00:21:11.775 "base_bdevs_list": [ 00:21:11.775 { 00:21:11.775 "name": "pt1", 00:21:11.775 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:11.775 "is_configured": true, 00:21:11.775 "data_offset": 2048, 00:21:11.775 "data_size": 63488 00:21:11.775 }, 00:21:11.775 { 00:21:11.775 "name": "pt2", 00:21:11.775 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:11.775 "is_configured": true, 00:21:11.775 "data_offset": 2048, 00:21:11.775 "data_size": 63488 00:21:11.775 }, 00:21:11.775 { 00:21:11.775 "name": "pt3", 00:21:11.775 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:11.775 "is_configured": true, 00:21:11.775 "data_offset": 2048, 00:21:11.775 "data_size": 63488 00:21:11.775 }, 00:21:11.775 { 00:21:11.775 "name": "pt4", 00:21:11.775 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:11.775 "is_configured": true, 00:21:11.775 "data_offset": 2048, 00:21:11.775 "data_size": 63488 00:21:11.775 } 00:21:11.775 ] 00:21:11.775 }' 00:21:11.775 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:11.775 22:28:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:12.340 22:28:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:12.340 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:12.340 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:12.340 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:12.340 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:12.340 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:12.340 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:12.340 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:12.598 [2024-07-12 22:28:22.861958] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:12.598 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:12.598 "name": "raid_bdev1", 00:21:12.598 "aliases": [ 00:21:12.598 "920338fe-02a1-4e9c-b149-c43f200816a6" 00:21:12.598 ], 00:21:12.598 "product_name": "Raid Volume", 00:21:12.598 "block_size": 512, 00:21:12.598 "num_blocks": 253952, 00:21:12.598 "uuid": "920338fe-02a1-4e9c-b149-c43f200816a6", 00:21:12.598 "assigned_rate_limits": { 00:21:12.598 "rw_ios_per_sec": 0, 00:21:12.598 "rw_mbytes_per_sec": 0, 00:21:12.598 "r_mbytes_per_sec": 0, 00:21:12.598 "w_mbytes_per_sec": 0 00:21:12.598 }, 00:21:12.598 "claimed": false, 00:21:12.598 "zoned": false, 00:21:12.598 "supported_io_types": { 00:21:12.598 "read": true, 00:21:12.598 "write": true, 00:21:12.598 "unmap": true, 00:21:12.598 "flush": true, 00:21:12.598 "reset": true, 00:21:12.598 "nvme_admin": false, 00:21:12.598 "nvme_io": false, 00:21:12.598 "nvme_io_md": false, 00:21:12.598 "write_zeroes": true, 00:21:12.598 "zcopy": false, 00:21:12.598 "get_zone_info": false, 00:21:12.598 "zone_management": false, 00:21:12.598 "zone_append": false, 00:21:12.598 "compare": false, 00:21:12.598 "compare_and_write": false, 00:21:12.598 "abort": false, 00:21:12.598 "seek_hole": false, 00:21:12.598 "seek_data": false, 00:21:12.598 "copy": false, 00:21:12.598 "nvme_iov_md": false 00:21:12.598 }, 00:21:12.598 "memory_domains": [ 00:21:12.598 { 00:21:12.598 "dma_device_id": "system", 00:21:12.598 "dma_device_type": 1 00:21:12.598 }, 00:21:12.598 { 00:21:12.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:12.598 "dma_device_type": 2 00:21:12.598 }, 00:21:12.598 { 00:21:12.598 "dma_device_id": "system", 00:21:12.598 "dma_device_type": 1 00:21:12.598 }, 00:21:12.598 { 00:21:12.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:12.598 "dma_device_type": 2 00:21:12.598 }, 00:21:12.598 { 00:21:12.598 "dma_device_id": "system", 00:21:12.598 "dma_device_type": 1 00:21:12.598 }, 00:21:12.598 { 00:21:12.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:12.598 "dma_device_type": 2 00:21:12.598 }, 00:21:12.598 { 00:21:12.598 "dma_device_id": "system", 00:21:12.598 "dma_device_type": 1 00:21:12.598 }, 00:21:12.598 { 00:21:12.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:12.598 "dma_device_type": 2 00:21:12.598 } 00:21:12.598 ], 00:21:12.598 "driver_specific": { 00:21:12.598 "raid": { 00:21:12.598 "uuid": "920338fe-02a1-4e9c-b149-c43f200816a6", 00:21:12.598 "strip_size_kb": 64, 00:21:12.598 "state": "online", 00:21:12.598 "raid_level": "concat", 00:21:12.598 "superblock": 
true, 00:21:12.598 "num_base_bdevs": 4, 00:21:12.598 "num_base_bdevs_discovered": 4, 00:21:12.598 "num_base_bdevs_operational": 4, 00:21:12.598 "base_bdevs_list": [ 00:21:12.598 { 00:21:12.598 "name": "pt1", 00:21:12.598 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:12.598 "is_configured": true, 00:21:12.598 "data_offset": 2048, 00:21:12.598 "data_size": 63488 00:21:12.598 }, 00:21:12.598 { 00:21:12.598 "name": "pt2", 00:21:12.598 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:12.598 "is_configured": true, 00:21:12.598 "data_offset": 2048, 00:21:12.598 "data_size": 63488 00:21:12.598 }, 00:21:12.598 { 00:21:12.598 "name": "pt3", 00:21:12.598 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:12.598 "is_configured": true, 00:21:12.598 "data_offset": 2048, 00:21:12.598 "data_size": 63488 00:21:12.598 }, 00:21:12.598 { 00:21:12.598 "name": "pt4", 00:21:12.598 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:12.598 "is_configured": true, 00:21:12.598 "data_offset": 2048, 00:21:12.598 "data_size": 63488 00:21:12.598 } 00:21:12.598 ] 00:21:12.598 } 00:21:12.598 } 00:21:12.598 }' 00:21:12.598 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:12.856 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:12.856 pt2 00:21:12.856 pt3 00:21:12.856 pt4' 00:21:12.856 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:12.856 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:12.856 22:28:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:13.112 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:13.112 "name": "pt1", 00:21:13.112 "aliases": [ 00:21:13.112 "00000000-0000-0000-0000-000000000001" 00:21:13.112 ], 00:21:13.112 "product_name": "passthru", 00:21:13.112 "block_size": 512, 00:21:13.112 "num_blocks": 65536, 00:21:13.112 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:13.112 "assigned_rate_limits": { 00:21:13.112 "rw_ios_per_sec": 0, 00:21:13.112 "rw_mbytes_per_sec": 0, 00:21:13.112 "r_mbytes_per_sec": 0, 00:21:13.112 "w_mbytes_per_sec": 0 00:21:13.112 }, 00:21:13.112 "claimed": true, 00:21:13.112 "claim_type": "exclusive_write", 00:21:13.112 "zoned": false, 00:21:13.112 "supported_io_types": { 00:21:13.112 "read": true, 00:21:13.112 "write": true, 00:21:13.112 "unmap": true, 00:21:13.112 "flush": true, 00:21:13.112 "reset": true, 00:21:13.112 "nvme_admin": false, 00:21:13.112 "nvme_io": false, 00:21:13.112 "nvme_io_md": false, 00:21:13.112 "write_zeroes": true, 00:21:13.113 "zcopy": true, 00:21:13.113 "get_zone_info": false, 00:21:13.113 "zone_management": false, 00:21:13.113 "zone_append": false, 00:21:13.113 "compare": false, 00:21:13.113 "compare_and_write": false, 00:21:13.113 "abort": true, 00:21:13.113 "seek_hole": false, 00:21:13.113 "seek_data": false, 00:21:13.113 "copy": true, 00:21:13.113 "nvme_iov_md": false 00:21:13.113 }, 00:21:13.113 "memory_domains": [ 00:21:13.113 { 00:21:13.113 "dma_device_id": "system", 00:21:13.113 "dma_device_type": 1 00:21:13.113 }, 00:21:13.113 { 00:21:13.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.113 "dma_device_type": 2 00:21:13.113 } 00:21:13.113 ], 00:21:13.113 "driver_specific": { 00:21:13.113 "passthru": 
{ 00:21:13.113 "name": "pt1", 00:21:13.113 "base_bdev_name": "malloc1" 00:21:13.113 } 00:21:13.113 } 00:21:13.113 }' 00:21:13.113 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.113 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.113 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:13.113 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.113 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.113 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:13.113 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.113 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.369 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:13.369 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.369 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.369 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:13.369 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:13.369 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:13.369 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:13.627 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:13.627 "name": "pt2", 00:21:13.627 "aliases": [ 00:21:13.627 "00000000-0000-0000-0000-000000000002" 00:21:13.627 ], 00:21:13.627 "product_name": "passthru", 00:21:13.627 "block_size": 512, 00:21:13.627 "num_blocks": 65536, 00:21:13.627 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:13.627 "assigned_rate_limits": { 00:21:13.627 "rw_ios_per_sec": 0, 00:21:13.627 "rw_mbytes_per_sec": 0, 00:21:13.627 "r_mbytes_per_sec": 0, 00:21:13.627 "w_mbytes_per_sec": 0 00:21:13.627 }, 00:21:13.627 "claimed": true, 00:21:13.627 "claim_type": "exclusive_write", 00:21:13.627 "zoned": false, 00:21:13.627 "supported_io_types": { 00:21:13.627 "read": true, 00:21:13.627 "write": true, 00:21:13.627 "unmap": true, 00:21:13.627 "flush": true, 00:21:13.627 "reset": true, 00:21:13.627 "nvme_admin": false, 00:21:13.627 "nvme_io": false, 00:21:13.627 "nvme_io_md": false, 00:21:13.627 "write_zeroes": true, 00:21:13.627 "zcopy": true, 00:21:13.627 "get_zone_info": false, 00:21:13.627 "zone_management": false, 00:21:13.627 "zone_append": false, 00:21:13.627 "compare": false, 00:21:13.627 "compare_and_write": false, 00:21:13.627 "abort": true, 00:21:13.627 "seek_hole": false, 00:21:13.627 "seek_data": false, 00:21:13.627 "copy": true, 00:21:13.627 "nvme_iov_md": false 00:21:13.627 }, 00:21:13.627 "memory_domains": [ 00:21:13.627 { 00:21:13.627 "dma_device_id": "system", 00:21:13.627 "dma_device_type": 1 00:21:13.627 }, 00:21:13.627 { 00:21:13.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.627 "dma_device_type": 2 00:21:13.627 } 00:21:13.627 ], 00:21:13.627 "driver_specific": { 00:21:13.627 "passthru": { 00:21:13.627 "name": "pt2", 00:21:13.627 "base_bdev_name": "malloc2" 00:21:13.627 } 00:21:13.627 } 00:21:13.627 }' 00:21:13.627 
22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.627 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.627 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:13.627 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.627 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.885 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:13.885 22:28:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.885 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.885 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:13.885 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.885 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.885 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:13.885 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:13.885 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:13.885 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:14.143 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:14.143 "name": "pt3", 00:21:14.143 "aliases": [ 00:21:14.143 "00000000-0000-0000-0000-000000000003" 00:21:14.143 ], 00:21:14.143 "product_name": "passthru", 00:21:14.143 "block_size": 512, 00:21:14.143 "num_blocks": 65536, 00:21:14.143 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:14.143 "assigned_rate_limits": { 00:21:14.143 "rw_ios_per_sec": 0, 00:21:14.143 "rw_mbytes_per_sec": 0, 00:21:14.143 "r_mbytes_per_sec": 0, 00:21:14.143 "w_mbytes_per_sec": 0 00:21:14.143 }, 00:21:14.143 "claimed": true, 00:21:14.143 "claim_type": "exclusive_write", 00:21:14.143 "zoned": false, 00:21:14.143 "supported_io_types": { 00:21:14.143 "read": true, 00:21:14.143 "write": true, 00:21:14.143 "unmap": true, 00:21:14.143 "flush": true, 00:21:14.143 "reset": true, 00:21:14.143 "nvme_admin": false, 00:21:14.143 "nvme_io": false, 00:21:14.143 "nvme_io_md": false, 00:21:14.143 "write_zeroes": true, 00:21:14.143 "zcopy": true, 00:21:14.143 "get_zone_info": false, 00:21:14.143 "zone_management": false, 00:21:14.143 "zone_append": false, 00:21:14.143 "compare": false, 00:21:14.143 "compare_and_write": false, 00:21:14.143 "abort": true, 00:21:14.143 "seek_hole": false, 00:21:14.143 "seek_data": false, 00:21:14.143 "copy": true, 00:21:14.143 "nvme_iov_md": false 00:21:14.143 }, 00:21:14.143 "memory_domains": [ 00:21:14.143 { 00:21:14.143 "dma_device_id": "system", 00:21:14.143 "dma_device_type": 1 00:21:14.143 }, 00:21:14.143 { 00:21:14.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.143 "dma_device_type": 2 00:21:14.143 } 00:21:14.143 ], 00:21:14.143 "driver_specific": { 00:21:14.143 "passthru": { 00:21:14.143 "name": "pt3", 00:21:14.143 "base_bdev_name": "malloc3" 00:21:14.143 } 00:21:14.143 } 00:21:14.143 }' 00:21:14.143 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.143 22:28:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.143 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:14.401 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:14.401 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:14.401 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:14.401 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:14.401 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:14.401 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:14.401 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:14.401 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:14.658 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:14.658 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:14.658 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:14.658 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:14.916 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:14.916 "name": "pt4", 00:21:14.916 "aliases": [ 00:21:14.916 "00000000-0000-0000-0000-000000000004" 00:21:14.916 ], 00:21:14.916 "product_name": "passthru", 00:21:14.916 "block_size": 512, 00:21:14.916 "num_blocks": 65536, 00:21:14.916 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:14.916 "assigned_rate_limits": { 00:21:14.916 "rw_ios_per_sec": 0, 00:21:14.916 "rw_mbytes_per_sec": 0, 00:21:14.916 "r_mbytes_per_sec": 0, 00:21:14.916 "w_mbytes_per_sec": 0 00:21:14.916 }, 00:21:14.916 "claimed": true, 00:21:14.916 "claim_type": "exclusive_write", 00:21:14.916 "zoned": false, 00:21:14.916 "supported_io_types": { 00:21:14.916 "read": true, 00:21:14.916 "write": true, 00:21:14.916 "unmap": true, 00:21:14.916 "flush": true, 00:21:14.916 "reset": true, 00:21:14.916 "nvme_admin": false, 00:21:14.916 "nvme_io": false, 00:21:14.916 "nvme_io_md": false, 00:21:14.916 "write_zeroes": true, 00:21:14.916 "zcopy": true, 00:21:14.916 "get_zone_info": false, 00:21:14.916 "zone_management": false, 00:21:14.916 "zone_append": false, 00:21:14.916 "compare": false, 00:21:14.916 "compare_and_write": false, 00:21:14.916 "abort": true, 00:21:14.916 "seek_hole": false, 00:21:14.916 "seek_data": false, 00:21:14.916 "copy": true, 00:21:14.916 "nvme_iov_md": false 00:21:14.916 }, 00:21:14.916 "memory_domains": [ 00:21:14.916 { 00:21:14.916 "dma_device_id": "system", 00:21:14.916 "dma_device_type": 1 00:21:14.916 }, 00:21:14.916 { 00:21:14.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.916 "dma_device_type": 2 00:21:14.916 } 00:21:14.916 ], 00:21:14.916 "driver_specific": { 00:21:14.916 "passthru": { 00:21:14.916 "name": "pt4", 00:21:14.916 "base_bdev_name": "malloc4" 00:21:14.916 } 00:21:14.916 } 00:21:14.916 }' 00:21:14.916 22:28:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.916 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.916 22:28:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:14.916 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:14.916 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:14.916 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:14.916 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:14.916 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.173 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:15.173 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.173 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.173 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:15.173 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:15.173 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:15.430 [2024-07-12 22:28:25.581205] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:15.430 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=920338fe-02a1-4e9c-b149-c43f200816a6 00:21:15.430 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 920338fe-02a1-4e9c-b149-c43f200816a6 ']' 00:21:15.430 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:15.687 [2024-07-12 22:28:25.821517] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:15.687 [2024-07-12 22:28:25.821538] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:15.687 [2024-07-12 22:28:25.821589] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:15.687 [2024-07-12 22:28:25.821652] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:15.687 [2024-07-12 22:28:25.821664] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12f8530 name raid_bdev1, state offline 00:21:15.687 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.687 22:28:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:15.945 22:28:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:15.945 22:28:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:15.945 22:28:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:15.945 22:28:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:16.203 22:28:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:16.203 22:28:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:16.461 22:28:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:16.461 22:28:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:16.718 22:28:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:16.718 22:28:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:16.976 22:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:16.976 22:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:17.234 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:17.234 [2024-07-12 22:28:27.546008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:17.234 [2024-07-12 22:28:27.547351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:17.234 [2024-07-12 22:28:27.547394] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
00:21:17.234 [2024-07-12 22:28:27.547427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:21:17.234 [2024-07-12 22:28:27.547472] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:17.234 [2024-07-12 22:28:27.547512] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:17.234 [2024-07-12 22:28:27.547535] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:17.234 [2024-07-12 22:28:27.547557] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:21:17.234 [2024-07-12 22:28:27.547576] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:17.234 [2024-07-12 22:28:27.547587] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14a3ff0 name raid_bdev1, state configuring 00:21:17.234 request: 00:21:17.234 { 00:21:17.234 "name": "raid_bdev1", 00:21:17.234 "raid_level": "concat", 00:21:17.234 "base_bdevs": [ 00:21:17.234 "malloc1", 00:21:17.234 "malloc2", 00:21:17.234 "malloc3", 00:21:17.234 "malloc4" 00:21:17.234 ], 00:21:17.234 "strip_size_kb": 64, 00:21:17.234 "superblock": false, 00:21:17.234 "method": "bdev_raid_create", 00:21:17.234 "req_id": 1 00:21:17.234 } 00:21:17.234 Got JSON-RPC error response 00:21:17.234 response: 00:21:17.234 { 00:21:17.234 "code": -17, 00:21:17.234 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:17.234 } 00:21:17.492 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:21:17.492 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:17.492 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:17.492 22:28:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:17.492 22:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.492 22:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:17.492 22:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:17.492 22:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:17.492 22:28:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:17.750 [2024-07-12 22:28:28.039250] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:17.750 [2024-07-12 22:28:28.039298] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:17.750 [2024-07-12 22:28:28.039320] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13007a0 00:21:17.750 [2024-07-12 22:28:28.039338] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:17.750 [2024-07-12 22:28:28.040973] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:17.750 [2024-07-12 22:28:28.041003] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:17.750 [2024-07-12 
22:28:28.041068] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:17.750 [2024-07-12 22:28:28.041096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:17.750 pt1 00:21:17.750 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:21:17.750 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:17.750 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:17.750 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:17.750 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:17.750 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:17.750 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:17.750 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:17.750 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:17.750 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:17.750 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.750 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.008 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.008 "name": "raid_bdev1", 00:21:18.008 "uuid": "920338fe-02a1-4e9c-b149-c43f200816a6", 00:21:18.008 "strip_size_kb": 64, 00:21:18.008 "state": "configuring", 00:21:18.008 "raid_level": "concat", 00:21:18.008 "superblock": true, 00:21:18.008 "num_base_bdevs": 4, 00:21:18.009 "num_base_bdevs_discovered": 1, 00:21:18.009 "num_base_bdevs_operational": 4, 00:21:18.009 "base_bdevs_list": [ 00:21:18.009 { 00:21:18.009 "name": "pt1", 00:21:18.009 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:18.009 "is_configured": true, 00:21:18.009 "data_offset": 2048, 00:21:18.009 "data_size": 63488 00:21:18.009 }, 00:21:18.009 { 00:21:18.009 "name": null, 00:21:18.009 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:18.009 "is_configured": false, 00:21:18.009 "data_offset": 2048, 00:21:18.009 "data_size": 63488 00:21:18.009 }, 00:21:18.009 { 00:21:18.009 "name": null, 00:21:18.009 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:18.009 "is_configured": false, 00:21:18.009 "data_offset": 2048, 00:21:18.009 "data_size": 63488 00:21:18.009 }, 00:21:18.009 { 00:21:18.009 "name": null, 00:21:18.009 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:18.009 "is_configured": false, 00:21:18.009 "data_offset": 2048, 00:21:18.009 "data_size": 63488 00:21:18.009 } 00:21:18.009 ] 00:21:18.009 }' 00:21:18.009 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.009 22:28:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:18.573 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:21:18.573 22:28:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:18.831 [2024-07-12 22:28:29.118099] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:18.831 [2024-07-12 22:28:29.118149] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:18.831 [2024-07-12 22:28:29.118168] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f7ea0 00:21:18.831 [2024-07-12 22:28:29.118180] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:18.831 [2024-07-12 22:28:29.118529] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:18.831 [2024-07-12 22:28:29.118548] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:18.831 [2024-07-12 22:28:29.118608] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:18.831 [2024-07-12 22:28:29.118636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:18.831 pt2 00:21:18.831 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:19.088 [2024-07-12 22:28:29.366763] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:19.088 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:21:19.088 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:19.088 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:19.088 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:19.088 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:19.088 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:19.088 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.088 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.088 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:19.088 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.088 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.088 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:19.346 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:19.346 "name": "raid_bdev1", 00:21:19.346 "uuid": "920338fe-02a1-4e9c-b149-c43f200816a6", 00:21:19.346 "strip_size_kb": 64, 00:21:19.346 "state": "configuring", 00:21:19.346 "raid_level": "concat", 00:21:19.346 "superblock": true, 00:21:19.346 "num_base_bdevs": 4, 00:21:19.346 "num_base_bdevs_discovered": 1, 00:21:19.346 "num_base_bdevs_operational": 4, 00:21:19.346 "base_bdevs_list": [ 00:21:19.346 { 00:21:19.346 "name": "pt1", 00:21:19.346 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:19.346 "is_configured": true, 00:21:19.346 "data_offset": 2048, 00:21:19.346 "data_size": 63488 00:21:19.346 }, 00:21:19.346 
{ 00:21:19.346 "name": null, 00:21:19.346 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:19.346 "is_configured": false, 00:21:19.346 "data_offset": 2048, 00:21:19.346 "data_size": 63488 00:21:19.346 }, 00:21:19.346 { 00:21:19.346 "name": null, 00:21:19.346 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:19.346 "is_configured": false, 00:21:19.346 "data_offset": 2048, 00:21:19.346 "data_size": 63488 00:21:19.346 }, 00:21:19.346 { 00:21:19.346 "name": null, 00:21:19.346 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:19.346 "is_configured": false, 00:21:19.346 "data_offset": 2048, 00:21:19.346 "data_size": 63488 00:21:19.346 } 00:21:19.346 ] 00:21:19.346 }' 00:21:19.346 22:28:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:19.346 22:28:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:19.911 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:19.911 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:19.911 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:20.169 [2024-07-12 22:28:30.441597] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:20.169 [2024-07-12 22:28:30.441652] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:20.169 [2024-07-12 22:28:30.441673] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f6ec0 00:21:20.169 [2024-07-12 22:28:30.441685] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:20.169 [2024-07-12 22:28:30.442043] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:20.169 [2024-07-12 22:28:30.442065] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:20.169 [2024-07-12 22:28:30.442132] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:20.169 [2024-07-12 22:28:30.442158] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:20.169 pt2 00:21:20.169 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:20.169 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:20.169 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:20.427 [2024-07-12 22:28:30.682235] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:20.427 [2024-07-12 22:28:30.682273] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:20.427 [2024-07-12 22:28:30.682291] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f70f0 00:21:20.427 [2024-07-12 22:28:30.682303] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:20.427 [2024-07-12 22:28:30.682618] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:20.427 [2024-07-12 22:28:30.682636] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:20.427 [2024-07-12 22:28:30.682691] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:20.427 [2024-07-12 22:28:30.682709] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:20.427 pt3 00:21:20.427 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:20.427 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:20.427 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:20.685 [2024-07-12 22:28:30.922872] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:20.685 [2024-07-12 22:28:30.922915] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:20.685 [2024-07-12 22:28:30.922942] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12ffaf0 00:21:20.685 [2024-07-12 22:28:30.922955] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:20.685 [2024-07-12 22:28:30.923265] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:20.685 [2024-07-12 22:28:30.923283] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:20.685 [2024-07-12 22:28:30.923339] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:20.685 [2024-07-12 22:28:30.923358] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:20.685 [2024-07-12 22:28:30.923479] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12f98f0 00:21:20.685 [2024-07-12 22:28:30.923490] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:20.685 [2024-07-12 22:28:30.923658] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12f9150 00:21:20.685 [2024-07-12 22:28:30.923787] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12f98f0 00:21:20.685 [2024-07-12 22:28:30.923797] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12f98f0 00:21:20.685 [2024-07-12 22:28:30.923897] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:20.685 pt4 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:20.685 22:28:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.685 22:28:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:20.943 22:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:20.943 "name": "raid_bdev1", 00:21:20.944 "uuid": "920338fe-02a1-4e9c-b149-c43f200816a6", 00:21:20.944 "strip_size_kb": 64, 00:21:20.944 "state": "online", 00:21:20.944 "raid_level": "concat", 00:21:20.944 "superblock": true, 00:21:20.944 "num_base_bdevs": 4, 00:21:20.944 "num_base_bdevs_discovered": 4, 00:21:20.944 "num_base_bdevs_operational": 4, 00:21:20.944 "base_bdevs_list": [ 00:21:20.944 { 00:21:20.944 "name": "pt1", 00:21:20.944 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:20.944 "is_configured": true, 00:21:20.944 "data_offset": 2048, 00:21:20.944 "data_size": 63488 00:21:20.944 }, 00:21:20.944 { 00:21:20.944 "name": "pt2", 00:21:20.944 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:20.944 "is_configured": true, 00:21:20.944 "data_offset": 2048, 00:21:20.944 "data_size": 63488 00:21:20.944 }, 00:21:20.944 { 00:21:20.944 "name": "pt3", 00:21:20.944 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:20.944 "is_configured": true, 00:21:20.944 "data_offset": 2048, 00:21:20.944 "data_size": 63488 00:21:20.944 }, 00:21:20.944 { 00:21:20.944 "name": "pt4", 00:21:20.944 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:20.944 "is_configured": true, 00:21:20.944 "data_offset": 2048, 00:21:20.944 "data_size": 63488 00:21:20.944 } 00:21:20.944 ] 00:21:20.944 }' 00:21:20.944 22:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.944 22:28:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:21.509 22:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:21.509 22:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:21.509 22:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:21.509 22:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:21.509 22:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:21.509 22:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:21.509 22:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:21.509 22:28:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:21.767 [2024-07-12 22:28:32.006083] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:21.767 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:21.767 "name": "raid_bdev1", 00:21:21.767 "aliases": [ 00:21:21.767 "920338fe-02a1-4e9c-b149-c43f200816a6" 00:21:21.767 ], 00:21:21.767 "product_name": "Raid Volume", 00:21:21.767 "block_size": 512, 00:21:21.767 "num_blocks": 253952, 00:21:21.767 "uuid": 
"920338fe-02a1-4e9c-b149-c43f200816a6", 00:21:21.767 "assigned_rate_limits": { 00:21:21.767 "rw_ios_per_sec": 0, 00:21:21.767 "rw_mbytes_per_sec": 0, 00:21:21.767 "r_mbytes_per_sec": 0, 00:21:21.767 "w_mbytes_per_sec": 0 00:21:21.767 }, 00:21:21.767 "claimed": false, 00:21:21.767 "zoned": false, 00:21:21.767 "supported_io_types": { 00:21:21.767 "read": true, 00:21:21.767 "write": true, 00:21:21.767 "unmap": true, 00:21:21.767 "flush": true, 00:21:21.767 "reset": true, 00:21:21.767 "nvme_admin": false, 00:21:21.767 "nvme_io": false, 00:21:21.767 "nvme_io_md": false, 00:21:21.767 "write_zeroes": true, 00:21:21.767 "zcopy": false, 00:21:21.767 "get_zone_info": false, 00:21:21.767 "zone_management": false, 00:21:21.767 "zone_append": false, 00:21:21.767 "compare": false, 00:21:21.767 "compare_and_write": false, 00:21:21.767 "abort": false, 00:21:21.767 "seek_hole": false, 00:21:21.767 "seek_data": false, 00:21:21.767 "copy": false, 00:21:21.767 "nvme_iov_md": false 00:21:21.767 }, 00:21:21.767 "memory_domains": [ 00:21:21.767 { 00:21:21.767 "dma_device_id": "system", 00:21:21.767 "dma_device_type": 1 00:21:21.767 }, 00:21:21.767 { 00:21:21.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.767 "dma_device_type": 2 00:21:21.767 }, 00:21:21.767 { 00:21:21.767 "dma_device_id": "system", 00:21:21.767 "dma_device_type": 1 00:21:21.767 }, 00:21:21.767 { 00:21:21.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.767 "dma_device_type": 2 00:21:21.767 }, 00:21:21.767 { 00:21:21.767 "dma_device_id": "system", 00:21:21.767 "dma_device_type": 1 00:21:21.767 }, 00:21:21.767 { 00:21:21.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.767 "dma_device_type": 2 00:21:21.767 }, 00:21:21.767 { 00:21:21.767 "dma_device_id": "system", 00:21:21.767 "dma_device_type": 1 00:21:21.767 }, 00:21:21.767 { 00:21:21.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.767 "dma_device_type": 2 00:21:21.767 } 00:21:21.767 ], 00:21:21.767 "driver_specific": { 00:21:21.767 "raid": { 00:21:21.767 "uuid": "920338fe-02a1-4e9c-b149-c43f200816a6", 00:21:21.767 "strip_size_kb": 64, 00:21:21.767 "state": "online", 00:21:21.767 "raid_level": "concat", 00:21:21.767 "superblock": true, 00:21:21.767 "num_base_bdevs": 4, 00:21:21.767 "num_base_bdevs_discovered": 4, 00:21:21.767 "num_base_bdevs_operational": 4, 00:21:21.767 "base_bdevs_list": [ 00:21:21.767 { 00:21:21.768 "name": "pt1", 00:21:21.768 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:21.768 "is_configured": true, 00:21:21.768 "data_offset": 2048, 00:21:21.768 "data_size": 63488 00:21:21.768 }, 00:21:21.768 { 00:21:21.768 "name": "pt2", 00:21:21.768 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:21.768 "is_configured": true, 00:21:21.768 "data_offset": 2048, 00:21:21.768 "data_size": 63488 00:21:21.768 }, 00:21:21.768 { 00:21:21.768 "name": "pt3", 00:21:21.768 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:21.768 "is_configured": true, 00:21:21.768 "data_offset": 2048, 00:21:21.768 "data_size": 63488 00:21:21.768 }, 00:21:21.768 { 00:21:21.768 "name": "pt4", 00:21:21.768 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:21.768 "is_configured": true, 00:21:21.768 "data_offset": 2048, 00:21:21.768 "data_size": 63488 00:21:21.768 } 00:21:21.768 ] 00:21:21.768 } 00:21:21.768 } 00:21:21.768 }' 00:21:21.768 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:21.768 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:21:21.768 pt2 00:21:21.768 pt3 00:21:21.768 pt4' 00:21:21.768 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:21.768 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:21.768 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:22.026 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:22.026 "name": "pt1", 00:21:22.026 "aliases": [ 00:21:22.026 "00000000-0000-0000-0000-000000000001" 00:21:22.026 ], 00:21:22.026 "product_name": "passthru", 00:21:22.026 "block_size": 512, 00:21:22.026 "num_blocks": 65536, 00:21:22.026 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:22.026 "assigned_rate_limits": { 00:21:22.026 "rw_ios_per_sec": 0, 00:21:22.026 "rw_mbytes_per_sec": 0, 00:21:22.026 "r_mbytes_per_sec": 0, 00:21:22.026 "w_mbytes_per_sec": 0 00:21:22.026 }, 00:21:22.026 "claimed": true, 00:21:22.026 "claim_type": "exclusive_write", 00:21:22.026 "zoned": false, 00:21:22.026 "supported_io_types": { 00:21:22.026 "read": true, 00:21:22.026 "write": true, 00:21:22.026 "unmap": true, 00:21:22.026 "flush": true, 00:21:22.026 "reset": true, 00:21:22.026 "nvme_admin": false, 00:21:22.026 "nvme_io": false, 00:21:22.026 "nvme_io_md": false, 00:21:22.026 "write_zeroes": true, 00:21:22.026 "zcopy": true, 00:21:22.026 "get_zone_info": false, 00:21:22.026 "zone_management": false, 00:21:22.026 "zone_append": false, 00:21:22.026 "compare": false, 00:21:22.026 "compare_and_write": false, 00:21:22.026 "abort": true, 00:21:22.026 "seek_hole": false, 00:21:22.026 "seek_data": false, 00:21:22.026 "copy": true, 00:21:22.026 "nvme_iov_md": false 00:21:22.026 }, 00:21:22.026 "memory_domains": [ 00:21:22.026 { 00:21:22.026 "dma_device_id": "system", 00:21:22.026 "dma_device_type": 1 00:21:22.026 }, 00:21:22.026 { 00:21:22.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.026 "dma_device_type": 2 00:21:22.026 } 00:21:22.026 ], 00:21:22.026 "driver_specific": { 00:21:22.026 "passthru": { 00:21:22.026 "name": "pt1", 00:21:22.026 "base_bdev_name": "malloc1" 00:21:22.026 } 00:21:22.026 } 00:21:22.026 }' 00:21:22.026 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.283 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.283 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:22.283 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.283 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.283 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:22.283 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.283 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.283 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:22.283 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.541 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.541 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:22.541 22:28:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:22.541 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:22.541 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:22.798 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:22.798 "name": "pt2", 00:21:22.798 "aliases": [ 00:21:22.798 "00000000-0000-0000-0000-000000000002" 00:21:22.798 ], 00:21:22.798 "product_name": "passthru", 00:21:22.798 "block_size": 512, 00:21:22.798 "num_blocks": 65536, 00:21:22.798 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:22.798 "assigned_rate_limits": { 00:21:22.798 "rw_ios_per_sec": 0, 00:21:22.798 "rw_mbytes_per_sec": 0, 00:21:22.798 "r_mbytes_per_sec": 0, 00:21:22.798 "w_mbytes_per_sec": 0 00:21:22.798 }, 00:21:22.798 "claimed": true, 00:21:22.798 "claim_type": "exclusive_write", 00:21:22.798 "zoned": false, 00:21:22.798 "supported_io_types": { 00:21:22.798 "read": true, 00:21:22.798 "write": true, 00:21:22.798 "unmap": true, 00:21:22.798 "flush": true, 00:21:22.798 "reset": true, 00:21:22.798 "nvme_admin": false, 00:21:22.798 "nvme_io": false, 00:21:22.798 "nvme_io_md": false, 00:21:22.798 "write_zeroes": true, 00:21:22.798 "zcopy": true, 00:21:22.798 "get_zone_info": false, 00:21:22.798 "zone_management": false, 00:21:22.798 "zone_append": false, 00:21:22.798 "compare": false, 00:21:22.798 "compare_and_write": false, 00:21:22.798 "abort": true, 00:21:22.798 "seek_hole": false, 00:21:22.798 "seek_data": false, 00:21:22.798 "copy": true, 00:21:22.798 "nvme_iov_md": false 00:21:22.798 }, 00:21:22.798 "memory_domains": [ 00:21:22.798 { 00:21:22.798 "dma_device_id": "system", 00:21:22.798 "dma_device_type": 1 00:21:22.798 }, 00:21:22.798 { 00:21:22.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.798 "dma_device_type": 2 00:21:22.798 } 00:21:22.798 ], 00:21:22.798 "driver_specific": { 00:21:22.798 "passthru": { 00:21:22.798 "name": "pt2", 00:21:22.798 "base_bdev_name": "malloc2" 00:21:22.798 } 00:21:22.798 } 00:21:22.798 }' 00:21:22.798 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.798 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.798 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:22.798 22:28:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.798 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.798 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:22.798 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:23.055 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:23.055 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:23.055 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.056 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.056 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:23.056 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:23.056 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:23.056 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:23.314 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:23.314 "name": "pt3", 00:21:23.314 "aliases": [ 00:21:23.314 "00000000-0000-0000-0000-000000000003" 00:21:23.314 ], 00:21:23.314 "product_name": "passthru", 00:21:23.314 "block_size": 512, 00:21:23.314 "num_blocks": 65536, 00:21:23.314 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:23.314 "assigned_rate_limits": { 00:21:23.314 "rw_ios_per_sec": 0, 00:21:23.314 "rw_mbytes_per_sec": 0, 00:21:23.314 "r_mbytes_per_sec": 0, 00:21:23.314 "w_mbytes_per_sec": 0 00:21:23.314 }, 00:21:23.314 "claimed": true, 00:21:23.314 "claim_type": "exclusive_write", 00:21:23.314 "zoned": false, 00:21:23.314 "supported_io_types": { 00:21:23.314 "read": true, 00:21:23.314 "write": true, 00:21:23.314 "unmap": true, 00:21:23.314 "flush": true, 00:21:23.314 "reset": true, 00:21:23.314 "nvme_admin": false, 00:21:23.314 "nvme_io": false, 00:21:23.314 "nvme_io_md": false, 00:21:23.314 "write_zeroes": true, 00:21:23.314 "zcopy": true, 00:21:23.314 "get_zone_info": false, 00:21:23.314 "zone_management": false, 00:21:23.314 "zone_append": false, 00:21:23.314 "compare": false, 00:21:23.314 "compare_and_write": false, 00:21:23.314 "abort": true, 00:21:23.314 "seek_hole": false, 00:21:23.314 "seek_data": false, 00:21:23.314 "copy": true, 00:21:23.314 "nvme_iov_md": false 00:21:23.314 }, 00:21:23.314 "memory_domains": [ 00:21:23.314 { 00:21:23.314 "dma_device_id": "system", 00:21:23.314 "dma_device_type": 1 00:21:23.314 }, 00:21:23.314 { 00:21:23.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:23.314 "dma_device_type": 2 00:21:23.314 } 00:21:23.314 ], 00:21:23.314 "driver_specific": { 00:21:23.314 "passthru": { 00:21:23.314 "name": "pt3", 00:21:23.314 "base_bdev_name": "malloc3" 00:21:23.314 } 00:21:23.314 } 00:21:23.314 }' 00:21:23.314 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:23.314 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:23.314 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:23.314 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:23.572 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:23.572 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:23.572 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:23.572 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:23.572 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:23.572 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.572 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.572 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:23.572 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:23.572 22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:23.572 
22:28:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:23.829 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:23.829 "name": "pt4", 00:21:23.829 "aliases": [ 00:21:23.829 "00000000-0000-0000-0000-000000000004" 00:21:23.829 ], 00:21:23.829 "product_name": "passthru", 00:21:23.829 "block_size": 512, 00:21:23.829 "num_blocks": 65536, 00:21:23.829 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:23.829 "assigned_rate_limits": { 00:21:23.829 "rw_ios_per_sec": 0, 00:21:23.829 "rw_mbytes_per_sec": 0, 00:21:23.829 "r_mbytes_per_sec": 0, 00:21:23.829 "w_mbytes_per_sec": 0 00:21:23.829 }, 00:21:23.829 "claimed": true, 00:21:23.829 "claim_type": "exclusive_write", 00:21:23.829 "zoned": false, 00:21:23.829 "supported_io_types": { 00:21:23.829 "read": true, 00:21:23.829 "write": true, 00:21:23.829 "unmap": true, 00:21:23.829 "flush": true, 00:21:23.829 "reset": true, 00:21:23.829 "nvme_admin": false, 00:21:23.829 "nvme_io": false, 00:21:23.829 "nvme_io_md": false, 00:21:23.829 "write_zeroes": true, 00:21:23.829 "zcopy": true, 00:21:23.829 "get_zone_info": false, 00:21:23.829 "zone_management": false, 00:21:23.829 "zone_append": false, 00:21:23.829 "compare": false, 00:21:23.829 "compare_and_write": false, 00:21:23.829 "abort": true, 00:21:23.829 "seek_hole": false, 00:21:23.829 "seek_data": false, 00:21:23.829 "copy": true, 00:21:23.829 "nvme_iov_md": false 00:21:23.829 }, 00:21:23.829 "memory_domains": [ 00:21:23.829 { 00:21:23.829 "dma_device_id": "system", 00:21:23.829 "dma_device_type": 1 00:21:23.829 }, 00:21:23.829 { 00:21:23.829 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:23.829 "dma_device_type": 2 00:21:23.829 } 00:21:23.829 ], 00:21:23.829 "driver_specific": { 00:21:23.829 "passthru": { 00:21:23.829 "name": "pt4", 00:21:23.829 "base_bdev_name": "malloc4" 00:21:23.829 } 00:21:23.829 } 00:21:23.829 }' 00:21:23.829 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:23.829 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:24.086 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:24.086 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:24.086 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:24.086 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:24.086 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:24.086 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:24.086 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:24.086 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:24.086 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:24.344 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:24.344 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:24.344 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:24.344 [2024-07-12 22:28:34.665151] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:24.602 22:28:34 
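Each pass through that loop ends with the same four assertions against the passthru bdev's JSON: block_size must be 512, and md_size, md_interleave and dif_type must all be null, i.e. plain 512-byte blocks with no separate metadata and no DIF. In shell terms the checks traced at bdev_raid.sh@205 through @208 amount to roughly the following, continuing the loop body of the previous sketch; the intermediate command substitutions are not named in the xtrace output, so the exact wording is an approximation.

  [[ $(echo "$base_bdev_info" | jq .block_size) == 512 ]]      # @205
  [[ $(echo "$base_bdev_info" | jq .md_size) == null ]]        # @206
  [[ $(echo "$base_bdev_info" | jq .md_interleave) == null ]]  # @207
  [[ $(echo "$base_bdev_info" | jq .dif_type) == null ]]       # @208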
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 920338fe-02a1-4e9c-b149-c43f200816a6 '!=' 920338fe-02a1-4e9c-b149-c43f200816a6 ']' 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3506608 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3506608 ']' 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3506608 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3506608 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3506608' 00:21:24.602 killing process with pid 3506608 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3506608 00:21:24.602 [2024-07-12 22:28:34.733906] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:24.602 [2024-07-12 22:28:34.733978] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:24.602 [2024-07-12 22:28:34.734041] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:24.602 [2024-07-12 22:28:34.734054] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12f98f0 name raid_bdev1, state offline 00:21:24.602 22:28:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3506608 00:21:24.602 [2024-07-12 22:28:34.770943] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:24.877 22:28:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:21:24.877 00:21:24.877 real 0m16.026s 00:21:24.877 user 0m28.999s 00:21:24.877 sys 0m2.841s 00:21:24.877 22:28:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:24.877 22:28:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:24.877 ************************************ 00:21:24.877 END TEST raid_superblock_test 00:21:24.877 ************************************ 00:21:24.877 22:28:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:24.877 22:28:35 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:21:24.877 22:28:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:24.877 22:28:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:24.877 22:28:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:24.877 ************************************ 00:21:24.877 START TEST raid_read_error_test 00:21:24.877 ************************************ 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 
-- # raid_io_error_test concat 4 read 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.6qKNgoHM0m 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3509035 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3509035 /var/tmp/spdk-raid.sock 00:21:24.878 
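raid_io_error_test drives everything through its own bdevperf instance rather than the main autotest app: bdevperf is started idle on a private RPC socket, and every rpc.py call that follows targets that socket. The launch traced above, reflowed for readability; all flags, paths and the socket are verbatim from the trace, while the comments, the $! capture and the redirection of bdevperf's output into the temp log are inferred (xtrace does not print redirections, but the fail-rate check at the end of the test greps that file).

  bdevperf_log=$(mktemp -p /raidtest)   # this run got /raidtest/tmp.6qKNgoHM0m
  # -z: start idle and wait for an RPC perform_tests trigger; -L bdev_raid: raid debug logging
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
      -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 \
      -o 128k -q 1 -z -f -L bdev_raid > "$bdevperf_log" &
  raid_pid=$!                           # 3509035 in this run
  waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock   # helper from autotest_common.sh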
22:28:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3509035 ']' 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:24.878 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:24.878 22:28:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:24.878 [2024-07-12 22:28:35.126532] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:21:24.878 [2024-07-12 22:28:35.126597] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3509035 ] 00:21:25.147 [2024-07-12 22:28:35.253042] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:25.147 [2024-07-12 22:28:35.357394] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:25.147 [2024-07-12 22:28:35.419019] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:25.147 [2024-07-12 22:28:35.419056] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:26.080 22:28:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:26.080 22:28:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:26.080 22:28:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:26.080 22:28:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:26.080 BaseBdev1_malloc 00:21:26.080 22:28:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:26.338 true 00:21:26.338 22:28:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:26.596 [2024-07-12 22:28:36.783557] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:26.596 [2024-07-12 22:28:36.783603] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:26.596 [2024-07-12 22:28:36.783623] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13eb0d0 00:21:26.596 [2024-07-12 22:28:36.783635] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:26.596 [2024-07-12 22:28:36.785494] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:26.596 [2024-07-12 22:28:36.785526] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:26.596 BaseBdev1 00:21:26.596 22:28:36 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:26.596 22:28:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:26.853 BaseBdev2_malloc 00:21:26.853 22:28:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:27.111 true 00:21:27.111 22:28:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:27.368 [2024-07-12 22:28:37.510106] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:27.368 [2024-07-12 22:28:37.510151] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:27.368 [2024-07-12 22:28:37.510172] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13ef910 00:21:27.368 [2024-07-12 22:28:37.510185] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:27.368 [2024-07-12 22:28:37.511778] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:27.368 [2024-07-12 22:28:37.511806] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:27.368 BaseBdev2 00:21:27.369 22:28:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:27.369 22:28:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:27.626 BaseBdev3_malloc 00:21:27.626 22:28:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:27.895 true 00:21:27.895 22:28:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:28.154 [2024-07-12 22:28:38.256642] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:28.154 [2024-07-12 22:28:38.256690] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:28.154 [2024-07-12 22:28:38.256712] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13f1bd0 00:21:28.154 [2024-07-12 22:28:38.256725] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:28.154 [2024-07-12 22:28:38.258363] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:28.154 [2024-07-12 22:28:38.258395] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:28.154 BaseBdev3 00:21:28.154 22:28:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:28.154 22:28:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:28.411 BaseBdev4_malloc 00:21:28.411 22:28:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:28.669 true 00:21:28.669 22:28:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:28.669 [2024-07-12 22:28:38.992021] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:28.669 [2024-07-12 22:28:38.992066] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:28.669 [2024-07-12 22:28:38.992086] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13f2aa0 00:21:28.669 [2024-07-12 22:28:38.992099] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:28.669 [2024-07-12 22:28:38.993605] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:28.669 [2024-07-12 22:28:38.993634] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:28.926 BaseBdev4 00:21:28.926 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:28.926 [2024-07-12 22:28:39.248727] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:28.926 [2024-07-12 22:28:39.250004] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:28.926 [2024-07-12 22:28:39.250072] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:28.926 [2024-07-12 22:28:39.250134] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:28.926 [2024-07-12 22:28:39.250367] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13ecc20 00:21:28.926 [2024-07-12 22:28:39.250378] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:28.926 [2024-07-12 22:28:39.250566] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1241260 00:21:28.926 [2024-07-12 22:28:39.250713] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13ecc20 00:21:28.926 [2024-07-12 22:28:39.250722] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13ecc20 00:21:28.926 [2024-07-12 22:28:39.250824] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:29.184 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:29.184 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:29.184 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:29.184 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:29.184 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:29.184 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:29.184 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:29.184 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local 
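Each of the four base devices built above is a three-layer stack: a 32 MB malloc bdev with 512-byte blocks (65536 blocks, matching num_blocks in the JSON dumps), wrapped by an error-injection bdev named EE_<malloc>, wrapped by a passthru bdev that the raid is actually assembled from, so failures can later be injected underneath the raid. The trace runs this unrolled per device; the loop below is a compression of it, using only commands that appear verbatim in the trace.

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for bdev in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
      $rpc bdev_malloc_create 32 512 -b ${bdev}_malloc           # 32 MB, 512-byte blocks
      $rpc bdev_error_create ${bdev}_malloc                      # exposes EE_${bdev}_malloc
      $rpc bdev_passthru_create -b EE_${bdev}_malloc -p $bdev    # raid member sits on top
  done
  # concat level, 64 KiB strips, superblock enabled (-s), name raid_bdev1
  $rpc bdev_raid_create -z 64 -r concat \
      -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s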
num_base_bdevs 00:21:29.184 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:29.184 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:29.184 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:29.184 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.442 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:29.442 "name": "raid_bdev1", 00:21:29.442 "uuid": "49b893a7-67e1-4590-9a00-9092fd137ad7", 00:21:29.442 "strip_size_kb": 64, 00:21:29.442 "state": "online", 00:21:29.442 "raid_level": "concat", 00:21:29.442 "superblock": true, 00:21:29.442 "num_base_bdevs": 4, 00:21:29.442 "num_base_bdevs_discovered": 4, 00:21:29.442 "num_base_bdevs_operational": 4, 00:21:29.442 "base_bdevs_list": [ 00:21:29.442 { 00:21:29.442 "name": "BaseBdev1", 00:21:29.442 "uuid": "8921995b-c605-57ce-b6bc-a39d5f2591d9", 00:21:29.442 "is_configured": true, 00:21:29.442 "data_offset": 2048, 00:21:29.442 "data_size": 63488 00:21:29.442 }, 00:21:29.442 { 00:21:29.442 "name": "BaseBdev2", 00:21:29.442 "uuid": "6df5f6e2-dcf7-5aa2-8c24-6c040934a004", 00:21:29.442 "is_configured": true, 00:21:29.442 "data_offset": 2048, 00:21:29.443 "data_size": 63488 00:21:29.443 }, 00:21:29.443 { 00:21:29.443 "name": "BaseBdev3", 00:21:29.443 "uuid": "9d8d9fd4-f147-538d-9d74-710d6faec9be", 00:21:29.443 "is_configured": true, 00:21:29.443 "data_offset": 2048, 00:21:29.443 "data_size": 63488 00:21:29.443 }, 00:21:29.443 { 00:21:29.443 "name": "BaseBdev4", 00:21:29.443 "uuid": "904620c5-ca2e-5ead-ad3e-f84988c02030", 00:21:29.443 "is_configured": true, 00:21:29.443 "data_offset": 2048, 00:21:29.443 "data_size": 63488 00:21:29.443 } 00:21:29.443 ] 00:21:29.443 }' 00:21:29.443 22:28:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:29.443 22:28:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:30.008 22:28:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:21:30.008 22:28:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:30.008 [2024-07-12 22:28:40.227595] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13defc0 00:21:30.941 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:31.199 22:28:41 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.199 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:31.458 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:31.458 "name": "raid_bdev1", 00:21:31.458 "uuid": "49b893a7-67e1-4590-9a00-9092fd137ad7", 00:21:31.458 "strip_size_kb": 64, 00:21:31.458 "state": "online", 00:21:31.458 "raid_level": "concat", 00:21:31.458 "superblock": true, 00:21:31.458 "num_base_bdevs": 4, 00:21:31.458 "num_base_bdevs_discovered": 4, 00:21:31.458 "num_base_bdevs_operational": 4, 00:21:31.458 "base_bdevs_list": [ 00:21:31.458 { 00:21:31.458 "name": "BaseBdev1", 00:21:31.458 "uuid": "8921995b-c605-57ce-b6bc-a39d5f2591d9", 00:21:31.458 "is_configured": true, 00:21:31.458 "data_offset": 2048, 00:21:31.458 "data_size": 63488 00:21:31.458 }, 00:21:31.458 { 00:21:31.458 "name": "BaseBdev2", 00:21:31.458 "uuid": "6df5f6e2-dcf7-5aa2-8c24-6c040934a004", 00:21:31.458 "is_configured": true, 00:21:31.458 "data_offset": 2048, 00:21:31.458 "data_size": 63488 00:21:31.458 }, 00:21:31.458 { 00:21:31.458 "name": "BaseBdev3", 00:21:31.458 "uuid": "9d8d9fd4-f147-538d-9d74-710d6faec9be", 00:21:31.458 "is_configured": true, 00:21:31.458 "data_offset": 2048, 00:21:31.458 "data_size": 63488 00:21:31.458 }, 00:21:31.458 { 00:21:31.458 "name": "BaseBdev4", 00:21:31.458 "uuid": "904620c5-ca2e-5ead-ad3e-f84988c02030", 00:21:31.458 "is_configured": true, 00:21:31.458 "data_offset": 2048, 00:21:31.458 "data_size": 63488 00:21:31.458 } 00:21:31.458 ] 00:21:31.458 }' 00:21:31.458 22:28:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:31.458 22:28:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:32.023 22:28:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:32.590 [2024-07-12 22:28:42.643186] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:32.590 [2024-07-12 22:28:42.643222] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:32.590 [2024-07-12 22:28:42.646400] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:32.590 [2024-07-12 22:28:42.646438] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:32.590 [2024-07-12 22:28:42.646478] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:32.590 [2024-07-12 22:28:42.646490] 
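verify_raid_bdev_state, whose locals and RPC query are traced above, fetches the raid entry with bdev_raid_get_bdevs all plus a jq select and compares it against the arguments it was called with (raid_bdev1 online concat 64 4). The first two lines below are verbatim from the trace; the field comparisons are a plausible condensation of what the helper checks and are not shown one-by-one in this stretch of the log.

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  raid_bdev_info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  [[ $(echo "$raid_bdev_info" | jq -r .state) == online ]]
  [[ $(echo "$raid_bdev_info" | jq -r .raid_level) == concat ]]
  [[ $(echo "$raid_bdev_info" | jq -r .strip_size_kb) == 64 ]]
  [[ $(echo "$raid_bdev_info" | jq -r .num_base_bdevs_discovered) == 4 ]]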
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13ecc20 name raid_bdev1, state offline 00:21:32.590 0 00:21:32.590 22:28:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3509035 00:21:32.590 22:28:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3509035 ']' 00:21:32.590 22:28:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3509035 00:21:32.590 22:28:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:21:32.590 22:28:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:32.590 22:28:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3509035 00:21:32.590 22:28:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:32.590 22:28:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:32.590 22:28:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3509035' 00:21:32.590 killing process with pid 3509035 00:21:32.590 22:28:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3509035 00:21:32.590 [2024-07-12 22:28:42.725508] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:32.590 22:28:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3509035 00:21:32.590 [2024-07-12 22:28:42.756769] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:32.849 22:28:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.6qKNgoHM0m 00:21:32.849 22:28:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:32.849 22:28:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:32.849 22:28:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.42 00:21:32.849 22:28:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:21:32.849 22:28:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:32.849 22:28:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:32.849 22:28:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.42 != \0\.\0\0 ]] 00:21:32.849 00:21:32.849 real 0m7.932s 00:21:32.849 user 0m12.821s 00:21:32.849 sys 0m1.353s 00:21:32.849 22:28:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:32.849 22:28:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:32.849 ************************************ 00:21:32.849 END TEST raid_read_error_test 00:21:32.849 ************************************ 00:21:32.849 22:28:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:32.849 22:28:43 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:21:32.849 22:28:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:32.849 22:28:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:32.849 22:28:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:32.849 ************************************ 00:21:32.849 START TEST raid_write_error_test 00:21:32.849 ************************************ 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # 
raid_io_error_test concat 4 write 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Mv0LI12bcd 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3510189 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3510189 /var/tmp/spdk-raid.sock 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f 
-L bdev_raid 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3510189 ']' 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:32.849 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:32.849 22:28:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:32.849 [2024-07-12 22:28:43.160654] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:21:32.849 [2024-07-12 22:28:43.160726] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3510189 ] 00:21:33.108 [2024-07-12 22:28:43.290897] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:33.108 [2024-07-12 22:28:43.391779] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:33.365 [2024-07-12 22:28:43.458150] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:33.365 [2024-07-12 22:28:43.458190] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:33.928 22:28:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:33.929 22:28:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:33.929 22:28:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:33.929 22:28:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:34.185 BaseBdev1_malloc 00:21:34.185 22:28:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:34.443 true 00:21:34.443 22:28:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:34.700 [2024-07-12 22:28:44.808151] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:34.700 [2024-07-12 22:28:44.808197] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:34.700 [2024-07-12 22:28:44.808217] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f10d0 00:21:34.700 [2024-07-12 22:28:44.808230] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:34.700 [2024-07-12 22:28:44.809948] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:34.700 [2024-07-12 22:28:44.809979] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:34.700 BaseBdev1 00:21:34.700 22:28:44 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:34.700 22:28:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:34.958 BaseBdev2_malloc 00:21:34.958 22:28:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:35.215 true 00:21:35.215 22:28:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:35.473 [2024-07-12 22:28:45.542626] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:35.473 [2024-07-12 22:28:45.542670] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:35.473 [2024-07-12 22:28:45.542690] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f5910 00:21:35.473 [2024-07-12 22:28:45.542703] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:35.473 [2024-07-12 22:28:45.544116] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:35.473 [2024-07-12 22:28:45.544146] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:35.473 BaseBdev2 00:21:35.473 22:28:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:35.473 22:28:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:35.473 BaseBdev3_malloc 00:21:35.730 22:28:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:35.730 true 00:21:35.730 22:28:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:35.987 [2024-07-12 22:28:46.273131] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:35.987 [2024-07-12 22:28:46.273182] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:35.987 [2024-07-12 22:28:46.273204] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f7bd0 00:21:35.987 [2024-07-12 22:28:46.273217] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:35.987 [2024-07-12 22:28:46.274725] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:35.987 [2024-07-12 22:28:46.274754] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:35.987 BaseBdev3 00:21:35.987 22:28:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:35.987 22:28:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:36.244 BaseBdev4_malloc 00:21:36.244 22:28:46 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:36.499 true 00:21:36.499 22:28:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:36.757 [2024-07-12 22:28:47.019794] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:36.757 [2024-07-12 22:28:47.019838] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:36.757 [2024-07-12 22:28:47.019860] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22f8aa0 00:21:36.757 [2024-07-12 22:28:47.019872] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:36.757 [2024-07-12 22:28:47.021270] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:36.757 [2024-07-12 22:28:47.021299] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:36.757 BaseBdev4 00:21:36.757 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:37.015 [2024-07-12 22:28:47.264590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:37.015 [2024-07-12 22:28:47.265760] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:37.015 [2024-07-12 22:28:47.265825] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:37.015 [2024-07-12 22:28:47.265886] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:37.015 [2024-07-12 22:28:47.266122] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22f2c20 00:21:37.015 [2024-07-12 22:28:47.266133] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:37.015 [2024-07-12 22:28:47.266309] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2147260 00:21:37.015 [2024-07-12 22:28:47.266452] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22f2c20 00:21:37.015 [2024-07-12 22:28:47.266462] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22f2c20 00:21:37.015 [2024-07-12 22:28:47.266557] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:37.015 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:37.015 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:37.015 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:37.015 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:37.015 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:37.015 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:37.015 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:37.015 22:28:47 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:37.015 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:37.015 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:37.015 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.015 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:37.273 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:37.273 "name": "raid_bdev1", 00:21:37.273 "uuid": "5f83dab6-2f82-4372-ba86-1371fd34cabf", 00:21:37.273 "strip_size_kb": 64, 00:21:37.273 "state": "online", 00:21:37.273 "raid_level": "concat", 00:21:37.273 "superblock": true, 00:21:37.273 "num_base_bdevs": 4, 00:21:37.273 "num_base_bdevs_discovered": 4, 00:21:37.273 "num_base_bdevs_operational": 4, 00:21:37.273 "base_bdevs_list": [ 00:21:37.273 { 00:21:37.273 "name": "BaseBdev1", 00:21:37.273 "uuid": "81f24d5a-de66-5a75-bc2e-ba1ea3dd8817", 00:21:37.273 "is_configured": true, 00:21:37.273 "data_offset": 2048, 00:21:37.273 "data_size": 63488 00:21:37.273 }, 00:21:37.273 { 00:21:37.273 "name": "BaseBdev2", 00:21:37.273 "uuid": "68f695ae-4e63-52e9-a71a-b6a51a8e889d", 00:21:37.273 "is_configured": true, 00:21:37.273 "data_offset": 2048, 00:21:37.273 "data_size": 63488 00:21:37.273 }, 00:21:37.273 { 00:21:37.273 "name": "BaseBdev3", 00:21:37.273 "uuid": "cd5011f4-1106-556f-9255-94ce1ee969aa", 00:21:37.273 "is_configured": true, 00:21:37.273 "data_offset": 2048, 00:21:37.273 "data_size": 63488 00:21:37.273 }, 00:21:37.273 { 00:21:37.273 "name": "BaseBdev4", 00:21:37.273 "uuid": "c8b5442e-fb56-5da2-88a6-6a0f180575e8", 00:21:37.273 "is_configured": true, 00:21:37.273 "data_offset": 2048, 00:21:37.273 "data_size": 63488 00:21:37.273 } 00:21:37.273 ] 00:21:37.273 }' 00:21:37.273 22:28:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:37.273 22:28:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:37.839 22:28:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:21:37.839 22:28:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:38.096 [2024-07-12 22:28:48.235459] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22e4fc0 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.027 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:39.283 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.283 "name": "raid_bdev1", 00:21:39.283 "uuid": "5f83dab6-2f82-4372-ba86-1371fd34cabf", 00:21:39.283 "strip_size_kb": 64, 00:21:39.283 "state": "online", 00:21:39.283 "raid_level": "concat", 00:21:39.283 "superblock": true, 00:21:39.283 "num_base_bdevs": 4, 00:21:39.283 "num_base_bdevs_discovered": 4, 00:21:39.283 "num_base_bdevs_operational": 4, 00:21:39.283 "base_bdevs_list": [ 00:21:39.283 { 00:21:39.283 "name": "BaseBdev1", 00:21:39.283 "uuid": "81f24d5a-de66-5a75-bc2e-ba1ea3dd8817", 00:21:39.283 "is_configured": true, 00:21:39.283 "data_offset": 2048, 00:21:39.283 "data_size": 63488 00:21:39.283 }, 00:21:39.283 { 00:21:39.283 "name": "BaseBdev2", 00:21:39.283 "uuid": "68f695ae-4e63-52e9-a71a-b6a51a8e889d", 00:21:39.283 "is_configured": true, 00:21:39.283 "data_offset": 2048, 00:21:39.283 "data_size": 63488 00:21:39.283 }, 00:21:39.283 { 00:21:39.283 "name": "BaseBdev3", 00:21:39.283 "uuid": "cd5011f4-1106-556f-9255-94ce1ee969aa", 00:21:39.283 "is_configured": true, 00:21:39.283 "data_offset": 2048, 00:21:39.283 "data_size": 63488 00:21:39.283 }, 00:21:39.283 { 00:21:39.283 "name": "BaseBdev4", 00:21:39.283 "uuid": "c8b5442e-fb56-5da2-88a6-6a0f180575e8", 00:21:39.283 "is_configured": true, 00:21:39.283 "data_offset": 2048, 00:21:39.283 "data_size": 63488 00:21:39.283 } 00:21:39.283 ] 00:21:39.283 }' 00:21:39.283 22:28:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.283 22:28:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:39.846 22:28:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:40.102 [2024-07-12 22:28:50.372966] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:40.102 [2024-07-12 22:28:50.373000] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:40.102 [2024-07-12 22:28:50.376219] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:40.102 [2024-07-12 22:28:50.376257] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:40.102 [2024-07-12 22:28:50.376299] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:21:40.103 [2024-07-12 22:28:50.376311] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22f2c20 name raid_bdev1, state offline 00:21:40.103 0 00:21:40.103 22:28:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3510189 00:21:40.103 22:28:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3510189 ']' 00:21:40.103 22:28:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3510189 00:21:40.103 22:28:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:21:40.103 22:28:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:40.103 22:28:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3510189 00:21:40.359 22:28:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:40.359 22:28:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:40.359 22:28:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3510189' 00:21:40.359 killing process with pid 3510189 00:21:40.359 22:28:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3510189 00:21:40.359 [2024-07-12 22:28:50.442333] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:40.359 22:28:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3510189 00:21:40.359 [2024-07-12 22:28:50.474240] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:40.617 22:28:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:40.617 22:28:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Mv0LI12bcd 00:21:40.617 22:28:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:40.617 22:28:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:21:40.617 22:28:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:21:40.617 22:28:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:40.617 22:28:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:40.617 22:28:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:21:40.617 00:21:40.617 real 0m7.637s 00:21:40.617 user 0m12.188s 00:21:40.617 sys 0m1.354s 00:21:40.617 22:28:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:40.617 22:28:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:40.617 ************************************ 00:21:40.617 END TEST raid_write_error_test 00:21:40.617 ************************************ 00:21:40.617 22:28:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:40.617 22:28:50 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:21:40.617 22:28:50 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:21:40.617 22:28:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:40.617 22:28:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:40.617 22:28:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:40.617 
************************************ 00:21:40.617 START TEST raid_state_function_test 00:21:40.617 ************************************ 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=3511340 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3511340' 00:21:40.617 Process raid 
pid: 3511340 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 3511340 /var/tmp/spdk-raid.sock 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 3511340 ']' 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:40.617 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:40.617 22:28:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:40.617 [2024-07-12 22:28:50.881004] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:21:40.617 [2024-07-12 22:28:50.881073] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:40.873 [2024-07-12 22:28:51.011066] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:40.873 [2024-07-12 22:28:51.113418] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:40.873 [2024-07-12 22:28:51.173358] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:40.873 [2024-07-12 22:28:51.173386] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:41.806 22:28:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:41.806 22:28:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:21:41.806 22:28:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:41.806 [2024-07-12 22:28:52.043302] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:41.806 [2024-07-12 22:28:52.043352] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:41.806 [2024-07-12 22:28:52.043363] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:41.806 [2024-07-12 22:28:52.043376] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:41.806 [2024-07-12 22:28:52.043389] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:41.806 [2024-07-12 22:28:52.043401] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:41.806 [2024-07-12 22:28:52.043410] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:41.806 [2024-07-12 22:28:52.043422] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: 
base bdev BaseBdev4 doesn't exist now 00:21:41.806 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:41.806 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:41.806 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:41.806 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:41.806 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:41.806 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:41.806 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:41.806 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:41.806 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:41.806 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:41.806 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.806 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:42.433 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.433 "name": "Existed_Raid", 00:21:42.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.433 "strip_size_kb": 0, 00:21:42.433 "state": "configuring", 00:21:42.433 "raid_level": "raid1", 00:21:42.433 "superblock": false, 00:21:42.433 "num_base_bdevs": 4, 00:21:42.433 "num_base_bdevs_discovered": 0, 00:21:42.433 "num_base_bdevs_operational": 4, 00:21:42.433 "base_bdevs_list": [ 00:21:42.433 { 00:21:42.433 "name": "BaseBdev1", 00:21:42.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.433 "is_configured": false, 00:21:42.433 "data_offset": 0, 00:21:42.433 "data_size": 0 00:21:42.433 }, 00:21:42.433 { 00:21:42.433 "name": "BaseBdev2", 00:21:42.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.433 "is_configured": false, 00:21:42.434 "data_offset": 0, 00:21:42.434 "data_size": 0 00:21:42.434 }, 00:21:42.434 { 00:21:42.434 "name": "BaseBdev3", 00:21:42.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.434 "is_configured": false, 00:21:42.434 "data_offset": 0, 00:21:42.434 "data_size": 0 00:21:42.434 }, 00:21:42.434 { 00:21:42.434 "name": "BaseBdev4", 00:21:42.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.434 "is_configured": false, 00:21:42.434 "data_offset": 0, 00:21:42.434 "data_size": 0 00:21:42.434 } 00:21:42.434 ] 00:21:42.434 }' 00:21:42.434 22:28:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.434 22:28:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:42.995 22:28:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:43.253 [2024-07-12 22:28:53.378671] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:43.253 [2024-07-12 22:28:53.378705] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0xe29aa0 name Existed_Raid, state configuring 00:21:43.253 22:28:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:43.511 [2024-07-12 22:28:53.623334] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:43.511 [2024-07-12 22:28:53.623367] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:43.511 [2024-07-12 22:28:53.623377] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:43.511 [2024-07-12 22:28:53.623397] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:43.511 [2024-07-12 22:28:53.623406] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:43.511 [2024-07-12 22:28:53.623418] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:43.511 [2024-07-12 22:28:53.623427] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:43.511 [2024-07-12 22:28:53.623438] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:43.511 22:28:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:43.769 [2024-07-12 22:28:53.879049] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:43.769 BaseBdev1 00:21:43.769 22:28:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:43.769 22:28:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:43.769 22:28:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:43.769 22:28:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:43.769 22:28:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:43.769 22:28:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:43.769 22:28:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:44.027 22:28:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:44.285 [ 00:21:44.285 { 00:21:44.285 "name": "BaseBdev1", 00:21:44.285 "aliases": [ 00:21:44.285 "a22ec79a-02f5-4f6d-ac31-f5d3602b51d2" 00:21:44.285 ], 00:21:44.285 "product_name": "Malloc disk", 00:21:44.285 "block_size": 512, 00:21:44.285 "num_blocks": 65536, 00:21:44.285 "uuid": "a22ec79a-02f5-4f6d-ac31-f5d3602b51d2", 00:21:44.285 "assigned_rate_limits": { 00:21:44.285 "rw_ios_per_sec": 0, 00:21:44.285 "rw_mbytes_per_sec": 0, 00:21:44.285 "r_mbytes_per_sec": 0, 00:21:44.285 "w_mbytes_per_sec": 0 00:21:44.285 }, 00:21:44.285 "claimed": true, 00:21:44.285 "claim_type": "exclusive_write", 00:21:44.285 "zoned": false, 00:21:44.285 "supported_io_types": { 00:21:44.285 "read": true, 00:21:44.285 
"write": true, 00:21:44.285 "unmap": true, 00:21:44.285 "flush": true, 00:21:44.285 "reset": true, 00:21:44.285 "nvme_admin": false, 00:21:44.285 "nvme_io": false, 00:21:44.285 "nvme_io_md": false, 00:21:44.285 "write_zeroes": true, 00:21:44.285 "zcopy": true, 00:21:44.285 "get_zone_info": false, 00:21:44.285 "zone_management": false, 00:21:44.285 "zone_append": false, 00:21:44.285 "compare": false, 00:21:44.285 "compare_and_write": false, 00:21:44.285 "abort": true, 00:21:44.285 "seek_hole": false, 00:21:44.285 "seek_data": false, 00:21:44.285 "copy": true, 00:21:44.285 "nvme_iov_md": false 00:21:44.285 }, 00:21:44.285 "memory_domains": [ 00:21:44.285 { 00:21:44.285 "dma_device_id": "system", 00:21:44.285 "dma_device_type": 1 00:21:44.285 }, 00:21:44.285 { 00:21:44.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.285 "dma_device_type": 2 00:21:44.285 } 00:21:44.285 ], 00:21:44.285 "driver_specific": {} 00:21:44.285 } 00:21:44.285 ] 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.285 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:44.543 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:44.543 "name": "Existed_Raid", 00:21:44.543 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.543 "strip_size_kb": 0, 00:21:44.543 "state": "configuring", 00:21:44.543 "raid_level": "raid1", 00:21:44.543 "superblock": false, 00:21:44.543 "num_base_bdevs": 4, 00:21:44.543 "num_base_bdevs_discovered": 1, 00:21:44.543 "num_base_bdevs_operational": 4, 00:21:44.543 "base_bdevs_list": [ 00:21:44.543 { 00:21:44.543 "name": "BaseBdev1", 00:21:44.544 "uuid": "a22ec79a-02f5-4f6d-ac31-f5d3602b51d2", 00:21:44.544 "is_configured": true, 00:21:44.544 "data_offset": 0, 00:21:44.544 "data_size": 65536 00:21:44.544 }, 00:21:44.544 { 00:21:44.544 "name": "BaseBdev2", 00:21:44.544 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.544 "is_configured": false, 00:21:44.544 "data_offset": 0, 00:21:44.544 "data_size": 0 00:21:44.544 }, 00:21:44.544 { 00:21:44.544 "name": "BaseBdev3", 
00:21:44.544 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.544 "is_configured": false, 00:21:44.544 "data_offset": 0, 00:21:44.544 "data_size": 0 00:21:44.544 }, 00:21:44.544 { 00:21:44.544 "name": "BaseBdev4", 00:21:44.544 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.544 "is_configured": false, 00:21:44.544 "data_offset": 0, 00:21:44.544 "data_size": 0 00:21:44.544 } 00:21:44.544 ] 00:21:44.544 }' 00:21:44.544 22:28:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:44.544 22:28:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:45.109 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:45.366 [2024-07-12 22:28:55.443182] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:45.366 [2024-07-12 22:28:55.443230] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe29310 name Existed_Raid, state configuring 00:21:45.367 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:45.367 [2024-07-12 22:28:55.687852] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:45.367 [2024-07-12 22:28:55.689369] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:45.367 [2024-07-12 22:28:55.689402] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:45.367 [2024-07-12 22:28:55.689413] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:45.367 [2024-07-12 22:28:55.689425] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:45.367 [2024-07-12 22:28:55.689434] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:45.367 [2024-07-12 22:28:55.689445] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.624 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:45.882 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.882 "name": "Existed_Raid", 00:21:45.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.882 "strip_size_kb": 0, 00:21:45.882 "state": "configuring", 00:21:45.882 "raid_level": "raid1", 00:21:45.882 "superblock": false, 00:21:45.882 "num_base_bdevs": 4, 00:21:45.882 "num_base_bdevs_discovered": 1, 00:21:45.882 "num_base_bdevs_operational": 4, 00:21:45.882 "base_bdevs_list": [ 00:21:45.882 { 00:21:45.882 "name": "BaseBdev1", 00:21:45.882 "uuid": "a22ec79a-02f5-4f6d-ac31-f5d3602b51d2", 00:21:45.882 "is_configured": true, 00:21:45.883 "data_offset": 0, 00:21:45.883 "data_size": 65536 00:21:45.883 }, 00:21:45.883 { 00:21:45.883 "name": "BaseBdev2", 00:21:45.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.883 "is_configured": false, 00:21:45.883 "data_offset": 0, 00:21:45.883 "data_size": 0 00:21:45.883 }, 00:21:45.883 { 00:21:45.883 "name": "BaseBdev3", 00:21:45.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.883 "is_configured": false, 00:21:45.883 "data_offset": 0, 00:21:45.883 "data_size": 0 00:21:45.883 }, 00:21:45.883 { 00:21:45.883 "name": "BaseBdev4", 00:21:45.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.883 "is_configured": false, 00:21:45.883 "data_offset": 0, 00:21:45.883 "data_size": 0 00:21:45.883 } 00:21:45.883 ] 00:21:45.883 }' 00:21:45.883 22:28:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.883 22:28:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:46.448 22:28:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:46.705 [2024-07-12 22:28:56.774151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:46.705 BaseBdev2 00:21:46.705 22:28:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:46.705 22:28:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:46.705 22:28:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:46.705 22:28:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:46.705 22:28:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:46.705 22:28:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:46.705 22:28:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:46.962 [ 00:21:46.962 { 
00:21:46.962 "name": "BaseBdev2", 00:21:46.962 "aliases": [ 00:21:46.962 "3655a16a-9911-4f35-bc52-96eecd89cd1c" 00:21:46.962 ], 00:21:46.962 "product_name": "Malloc disk", 00:21:46.962 "block_size": 512, 00:21:46.962 "num_blocks": 65536, 00:21:46.962 "uuid": "3655a16a-9911-4f35-bc52-96eecd89cd1c", 00:21:46.962 "assigned_rate_limits": { 00:21:46.962 "rw_ios_per_sec": 0, 00:21:46.962 "rw_mbytes_per_sec": 0, 00:21:46.962 "r_mbytes_per_sec": 0, 00:21:46.962 "w_mbytes_per_sec": 0 00:21:46.962 }, 00:21:46.962 "claimed": true, 00:21:46.962 "claim_type": "exclusive_write", 00:21:46.962 "zoned": false, 00:21:46.962 "supported_io_types": { 00:21:46.962 "read": true, 00:21:46.962 "write": true, 00:21:46.962 "unmap": true, 00:21:46.962 "flush": true, 00:21:46.962 "reset": true, 00:21:46.962 "nvme_admin": false, 00:21:46.962 "nvme_io": false, 00:21:46.962 "nvme_io_md": false, 00:21:46.962 "write_zeroes": true, 00:21:46.962 "zcopy": true, 00:21:46.962 "get_zone_info": false, 00:21:46.962 "zone_management": false, 00:21:46.962 "zone_append": false, 00:21:46.962 "compare": false, 00:21:46.962 "compare_and_write": false, 00:21:46.962 "abort": true, 00:21:46.962 "seek_hole": false, 00:21:46.962 "seek_data": false, 00:21:46.962 "copy": true, 00:21:46.962 "nvme_iov_md": false 00:21:46.962 }, 00:21:46.962 "memory_domains": [ 00:21:46.962 { 00:21:46.962 "dma_device_id": "system", 00:21:46.962 "dma_device_type": 1 00:21:46.962 }, 00:21:46.962 { 00:21:46.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.962 "dma_device_type": 2 00:21:46.962 } 00:21:46.962 ], 00:21:46.962 "driver_specific": {} 00:21:46.962 } 00:21:46.962 ] 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:46.962 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.219 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:47.219 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.219 
"name": "Existed_Raid", 00:21:47.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.219 "strip_size_kb": 0, 00:21:47.219 "state": "configuring", 00:21:47.219 "raid_level": "raid1", 00:21:47.219 "superblock": false, 00:21:47.219 "num_base_bdevs": 4, 00:21:47.219 "num_base_bdevs_discovered": 2, 00:21:47.219 "num_base_bdevs_operational": 4, 00:21:47.219 "base_bdevs_list": [ 00:21:47.219 { 00:21:47.219 "name": "BaseBdev1", 00:21:47.219 "uuid": "a22ec79a-02f5-4f6d-ac31-f5d3602b51d2", 00:21:47.219 "is_configured": true, 00:21:47.219 "data_offset": 0, 00:21:47.219 "data_size": 65536 00:21:47.219 }, 00:21:47.219 { 00:21:47.219 "name": "BaseBdev2", 00:21:47.219 "uuid": "3655a16a-9911-4f35-bc52-96eecd89cd1c", 00:21:47.219 "is_configured": true, 00:21:47.219 "data_offset": 0, 00:21:47.219 "data_size": 65536 00:21:47.219 }, 00:21:47.219 { 00:21:47.219 "name": "BaseBdev3", 00:21:47.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.219 "is_configured": false, 00:21:47.219 "data_offset": 0, 00:21:47.219 "data_size": 0 00:21:47.219 }, 00:21:47.219 { 00:21:47.219 "name": "BaseBdev4", 00:21:47.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.219 "is_configured": false, 00:21:47.219 "data_offset": 0, 00:21:47.219 "data_size": 0 00:21:47.219 } 00:21:47.219 ] 00:21:47.219 }' 00:21:47.219 22:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.219 22:28:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:48.151 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:48.151 [2024-07-12 22:28:58.337759] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:48.151 BaseBdev3 00:21:48.151 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:48.151 22:28:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:48.151 22:28:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:48.151 22:28:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:48.151 22:28:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:48.151 22:28:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:48.151 22:28:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:48.408 22:28:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:48.666 [ 00:21:48.666 { 00:21:48.666 "name": "BaseBdev3", 00:21:48.666 "aliases": [ 00:21:48.666 "3a8155a3-7a79-4d91-b769-389bba591d9d" 00:21:48.666 ], 00:21:48.666 "product_name": "Malloc disk", 00:21:48.666 "block_size": 512, 00:21:48.666 "num_blocks": 65536, 00:21:48.666 "uuid": "3a8155a3-7a79-4d91-b769-389bba591d9d", 00:21:48.666 "assigned_rate_limits": { 00:21:48.666 "rw_ios_per_sec": 0, 00:21:48.666 "rw_mbytes_per_sec": 0, 00:21:48.666 "r_mbytes_per_sec": 0, 00:21:48.666 "w_mbytes_per_sec": 0 00:21:48.666 }, 00:21:48.666 "claimed": true, 00:21:48.666 "claim_type": 
"exclusive_write", 00:21:48.666 "zoned": false, 00:21:48.666 "supported_io_types": { 00:21:48.666 "read": true, 00:21:48.666 "write": true, 00:21:48.666 "unmap": true, 00:21:48.666 "flush": true, 00:21:48.666 "reset": true, 00:21:48.666 "nvme_admin": false, 00:21:48.666 "nvme_io": false, 00:21:48.666 "nvme_io_md": false, 00:21:48.666 "write_zeroes": true, 00:21:48.666 "zcopy": true, 00:21:48.666 "get_zone_info": false, 00:21:48.666 "zone_management": false, 00:21:48.666 "zone_append": false, 00:21:48.666 "compare": false, 00:21:48.666 "compare_and_write": false, 00:21:48.666 "abort": true, 00:21:48.666 "seek_hole": false, 00:21:48.666 "seek_data": false, 00:21:48.666 "copy": true, 00:21:48.666 "nvme_iov_md": false 00:21:48.666 }, 00:21:48.666 "memory_domains": [ 00:21:48.666 { 00:21:48.666 "dma_device_id": "system", 00:21:48.666 "dma_device_type": 1 00:21:48.666 }, 00:21:48.666 { 00:21:48.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:48.666 "dma_device_type": 2 00:21:48.666 } 00:21:48.666 ], 00:21:48.666 "driver_specific": {} 00:21:48.666 } 00:21:48.666 ] 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.666 22:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:48.923 22:28:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:48.923 "name": "Existed_Raid", 00:21:48.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.923 "strip_size_kb": 0, 00:21:48.923 "state": "configuring", 00:21:48.923 "raid_level": "raid1", 00:21:48.923 "superblock": false, 00:21:48.923 "num_base_bdevs": 4, 00:21:48.923 "num_base_bdevs_discovered": 3, 00:21:48.923 "num_base_bdevs_operational": 4, 00:21:48.923 "base_bdevs_list": [ 00:21:48.923 { 00:21:48.923 "name": "BaseBdev1", 00:21:48.923 "uuid": "a22ec79a-02f5-4f6d-ac31-f5d3602b51d2", 00:21:48.923 "is_configured": true, 00:21:48.923 
"data_offset": 0, 00:21:48.923 "data_size": 65536 00:21:48.923 }, 00:21:48.923 { 00:21:48.923 "name": "BaseBdev2", 00:21:48.923 "uuid": "3655a16a-9911-4f35-bc52-96eecd89cd1c", 00:21:48.923 "is_configured": true, 00:21:48.923 "data_offset": 0, 00:21:48.923 "data_size": 65536 00:21:48.923 }, 00:21:48.923 { 00:21:48.923 "name": "BaseBdev3", 00:21:48.923 "uuid": "3a8155a3-7a79-4d91-b769-389bba591d9d", 00:21:48.923 "is_configured": true, 00:21:48.923 "data_offset": 0, 00:21:48.923 "data_size": 65536 00:21:48.923 }, 00:21:48.923 { 00:21:48.923 "name": "BaseBdev4", 00:21:48.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.923 "is_configured": false, 00:21:48.923 "data_offset": 0, 00:21:48.923 "data_size": 0 00:21:48.923 } 00:21:48.923 ] 00:21:48.923 }' 00:21:48.923 22:28:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:48.924 22:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:49.488 22:28:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:49.746 [2024-07-12 22:28:59.905351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:49.746 [2024-07-12 22:28:59.905392] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe2a350 00:21:49.746 [2024-07-12 22:28:59.905401] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:49.746 [2024-07-12 22:28:59.905657] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe2a020 00:21:49.746 [2024-07-12 22:28:59.905788] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe2a350 00:21:49.746 [2024-07-12 22:28:59.905798] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe2a350 00:21:49.746 [2024-07-12 22:28:59.905986] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:49.746 BaseBdev4 00:21:49.746 22:28:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:49.746 22:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:49.746 22:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:49.746 22:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:49.746 22:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:49.746 22:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:49.746 22:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:50.003 22:29:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:50.260 [ 00:21:50.260 { 00:21:50.260 "name": "BaseBdev4", 00:21:50.260 "aliases": [ 00:21:50.260 "9bc67df5-005c-4192-a534-3786824377e3" 00:21:50.260 ], 00:21:50.260 "product_name": "Malloc disk", 00:21:50.260 "block_size": 512, 00:21:50.260 "num_blocks": 65536, 00:21:50.260 "uuid": "9bc67df5-005c-4192-a534-3786824377e3", 00:21:50.260 "assigned_rate_limits": { 
00:21:50.260 "rw_ios_per_sec": 0, 00:21:50.260 "rw_mbytes_per_sec": 0, 00:21:50.260 "r_mbytes_per_sec": 0, 00:21:50.260 "w_mbytes_per_sec": 0 00:21:50.260 }, 00:21:50.260 "claimed": true, 00:21:50.260 "claim_type": "exclusive_write", 00:21:50.260 "zoned": false, 00:21:50.260 "supported_io_types": { 00:21:50.260 "read": true, 00:21:50.260 "write": true, 00:21:50.260 "unmap": true, 00:21:50.260 "flush": true, 00:21:50.260 "reset": true, 00:21:50.260 "nvme_admin": false, 00:21:50.260 "nvme_io": false, 00:21:50.260 "nvme_io_md": false, 00:21:50.260 "write_zeroes": true, 00:21:50.260 "zcopy": true, 00:21:50.260 "get_zone_info": false, 00:21:50.260 "zone_management": false, 00:21:50.260 "zone_append": false, 00:21:50.260 "compare": false, 00:21:50.260 "compare_and_write": false, 00:21:50.260 "abort": true, 00:21:50.260 "seek_hole": false, 00:21:50.260 "seek_data": false, 00:21:50.260 "copy": true, 00:21:50.260 "nvme_iov_md": false 00:21:50.260 }, 00:21:50.260 "memory_domains": [ 00:21:50.261 { 00:21:50.261 "dma_device_id": "system", 00:21:50.261 "dma_device_type": 1 00:21:50.261 }, 00:21:50.261 { 00:21:50.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:50.261 "dma_device_type": 2 00:21:50.261 } 00:21:50.261 ], 00:21:50.261 "driver_specific": {} 00:21:50.261 } 00:21:50.261 ] 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.261 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:50.518 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.518 "name": "Existed_Raid", 00:21:50.518 "uuid": "d44950a4-c6e2-4930-aab5-2b3fc3f7b2f4", 00:21:50.518 "strip_size_kb": 0, 00:21:50.518 "state": "online", 00:21:50.518 "raid_level": "raid1", 00:21:50.518 "superblock": false, 00:21:50.518 "num_base_bdevs": 4, 00:21:50.518 "num_base_bdevs_discovered": 4, 00:21:50.518 "num_base_bdevs_operational": 4, 
00:21:50.518 "base_bdevs_list": [ 00:21:50.518 { 00:21:50.518 "name": "BaseBdev1", 00:21:50.518 "uuid": "a22ec79a-02f5-4f6d-ac31-f5d3602b51d2", 00:21:50.518 "is_configured": true, 00:21:50.518 "data_offset": 0, 00:21:50.518 "data_size": 65536 00:21:50.518 }, 00:21:50.518 { 00:21:50.518 "name": "BaseBdev2", 00:21:50.518 "uuid": "3655a16a-9911-4f35-bc52-96eecd89cd1c", 00:21:50.518 "is_configured": true, 00:21:50.518 "data_offset": 0, 00:21:50.518 "data_size": 65536 00:21:50.518 }, 00:21:50.518 { 00:21:50.518 "name": "BaseBdev3", 00:21:50.518 "uuid": "3a8155a3-7a79-4d91-b769-389bba591d9d", 00:21:50.518 "is_configured": true, 00:21:50.518 "data_offset": 0, 00:21:50.518 "data_size": 65536 00:21:50.518 }, 00:21:50.518 { 00:21:50.518 "name": "BaseBdev4", 00:21:50.518 "uuid": "9bc67df5-005c-4192-a534-3786824377e3", 00:21:50.518 "is_configured": true, 00:21:50.518 "data_offset": 0, 00:21:50.518 "data_size": 65536 00:21:50.518 } 00:21:50.518 ] 00:21:50.518 }' 00:21:50.518 22:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.518 22:29:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:51.082 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:51.082 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:51.082 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:51.082 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:51.082 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:51.082 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:51.082 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:51.082 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:51.082 [2024-07-12 22:29:01.385623] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:51.340 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:51.340 "name": "Existed_Raid", 00:21:51.340 "aliases": [ 00:21:51.340 "d44950a4-c6e2-4930-aab5-2b3fc3f7b2f4" 00:21:51.340 ], 00:21:51.340 "product_name": "Raid Volume", 00:21:51.340 "block_size": 512, 00:21:51.340 "num_blocks": 65536, 00:21:51.340 "uuid": "d44950a4-c6e2-4930-aab5-2b3fc3f7b2f4", 00:21:51.340 "assigned_rate_limits": { 00:21:51.340 "rw_ios_per_sec": 0, 00:21:51.340 "rw_mbytes_per_sec": 0, 00:21:51.340 "r_mbytes_per_sec": 0, 00:21:51.340 "w_mbytes_per_sec": 0 00:21:51.340 }, 00:21:51.340 "claimed": false, 00:21:51.340 "zoned": false, 00:21:51.340 "supported_io_types": { 00:21:51.340 "read": true, 00:21:51.340 "write": true, 00:21:51.340 "unmap": false, 00:21:51.340 "flush": false, 00:21:51.340 "reset": true, 00:21:51.340 "nvme_admin": false, 00:21:51.340 "nvme_io": false, 00:21:51.340 "nvme_io_md": false, 00:21:51.340 "write_zeroes": true, 00:21:51.340 "zcopy": false, 00:21:51.340 "get_zone_info": false, 00:21:51.340 "zone_management": false, 00:21:51.340 "zone_append": false, 00:21:51.340 "compare": false, 00:21:51.340 "compare_and_write": false, 00:21:51.340 "abort": false, 00:21:51.340 "seek_hole": false, 00:21:51.340 "seek_data": false, 
00:21:51.340 "copy": false, 00:21:51.340 "nvme_iov_md": false 00:21:51.340 }, 00:21:51.340 "memory_domains": [ 00:21:51.340 { 00:21:51.340 "dma_device_id": "system", 00:21:51.340 "dma_device_type": 1 00:21:51.340 }, 00:21:51.340 { 00:21:51.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.340 "dma_device_type": 2 00:21:51.340 }, 00:21:51.340 { 00:21:51.340 "dma_device_id": "system", 00:21:51.340 "dma_device_type": 1 00:21:51.340 }, 00:21:51.340 { 00:21:51.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.340 "dma_device_type": 2 00:21:51.340 }, 00:21:51.340 { 00:21:51.340 "dma_device_id": "system", 00:21:51.340 "dma_device_type": 1 00:21:51.340 }, 00:21:51.340 { 00:21:51.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.340 "dma_device_type": 2 00:21:51.340 }, 00:21:51.340 { 00:21:51.340 "dma_device_id": "system", 00:21:51.340 "dma_device_type": 1 00:21:51.340 }, 00:21:51.340 { 00:21:51.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.340 "dma_device_type": 2 00:21:51.340 } 00:21:51.340 ], 00:21:51.340 "driver_specific": { 00:21:51.340 "raid": { 00:21:51.340 "uuid": "d44950a4-c6e2-4930-aab5-2b3fc3f7b2f4", 00:21:51.340 "strip_size_kb": 0, 00:21:51.340 "state": "online", 00:21:51.340 "raid_level": "raid1", 00:21:51.340 "superblock": false, 00:21:51.340 "num_base_bdevs": 4, 00:21:51.340 "num_base_bdevs_discovered": 4, 00:21:51.340 "num_base_bdevs_operational": 4, 00:21:51.340 "base_bdevs_list": [ 00:21:51.340 { 00:21:51.340 "name": "BaseBdev1", 00:21:51.340 "uuid": "a22ec79a-02f5-4f6d-ac31-f5d3602b51d2", 00:21:51.340 "is_configured": true, 00:21:51.340 "data_offset": 0, 00:21:51.340 "data_size": 65536 00:21:51.340 }, 00:21:51.340 { 00:21:51.340 "name": "BaseBdev2", 00:21:51.340 "uuid": "3655a16a-9911-4f35-bc52-96eecd89cd1c", 00:21:51.340 "is_configured": true, 00:21:51.340 "data_offset": 0, 00:21:51.340 "data_size": 65536 00:21:51.340 }, 00:21:51.340 { 00:21:51.340 "name": "BaseBdev3", 00:21:51.340 "uuid": "3a8155a3-7a79-4d91-b769-389bba591d9d", 00:21:51.340 "is_configured": true, 00:21:51.340 "data_offset": 0, 00:21:51.340 "data_size": 65536 00:21:51.340 }, 00:21:51.340 { 00:21:51.340 "name": "BaseBdev4", 00:21:51.340 "uuid": "9bc67df5-005c-4192-a534-3786824377e3", 00:21:51.340 "is_configured": true, 00:21:51.340 "data_offset": 0, 00:21:51.340 "data_size": 65536 00:21:51.340 } 00:21:51.340 ] 00:21:51.340 } 00:21:51.340 } 00:21:51.340 }' 00:21:51.340 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:51.340 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:51.340 BaseBdev2 00:21:51.341 BaseBdev3 00:21:51.341 BaseBdev4' 00:21:51.341 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:51.341 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:51.341 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:51.599 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:51.599 "name": "BaseBdev1", 00:21:51.599 "aliases": [ 00:21:51.599 "a22ec79a-02f5-4f6d-ac31-f5d3602b51d2" 00:21:51.599 ], 00:21:51.599 "product_name": "Malloc disk", 00:21:51.599 "block_size": 512, 00:21:51.599 "num_blocks": 65536, 00:21:51.599 "uuid": 
"a22ec79a-02f5-4f6d-ac31-f5d3602b51d2", 00:21:51.599 "assigned_rate_limits": { 00:21:51.599 "rw_ios_per_sec": 0, 00:21:51.599 "rw_mbytes_per_sec": 0, 00:21:51.599 "r_mbytes_per_sec": 0, 00:21:51.599 "w_mbytes_per_sec": 0 00:21:51.599 }, 00:21:51.599 "claimed": true, 00:21:51.599 "claim_type": "exclusive_write", 00:21:51.599 "zoned": false, 00:21:51.599 "supported_io_types": { 00:21:51.599 "read": true, 00:21:51.599 "write": true, 00:21:51.599 "unmap": true, 00:21:51.599 "flush": true, 00:21:51.599 "reset": true, 00:21:51.599 "nvme_admin": false, 00:21:51.599 "nvme_io": false, 00:21:51.599 "nvme_io_md": false, 00:21:51.599 "write_zeroes": true, 00:21:51.599 "zcopy": true, 00:21:51.599 "get_zone_info": false, 00:21:51.599 "zone_management": false, 00:21:51.599 "zone_append": false, 00:21:51.599 "compare": false, 00:21:51.599 "compare_and_write": false, 00:21:51.599 "abort": true, 00:21:51.599 "seek_hole": false, 00:21:51.599 "seek_data": false, 00:21:51.599 "copy": true, 00:21:51.599 "nvme_iov_md": false 00:21:51.599 }, 00:21:51.599 "memory_domains": [ 00:21:51.599 { 00:21:51.599 "dma_device_id": "system", 00:21:51.599 "dma_device_type": 1 00:21:51.599 }, 00:21:51.599 { 00:21:51.599 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.599 "dma_device_type": 2 00:21:51.599 } 00:21:51.599 ], 00:21:51.599 "driver_specific": {} 00:21:51.599 }' 00:21:51.599 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:51.599 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:51.599 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:51.599 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:51.599 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:51.599 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:51.599 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:51.599 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:51.858 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:51.858 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:51.858 22:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:51.858 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:51.858 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:51.858 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:51.858 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:52.117 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:52.117 "name": "BaseBdev2", 00:21:52.117 "aliases": [ 00:21:52.117 "3655a16a-9911-4f35-bc52-96eecd89cd1c" 00:21:52.117 ], 00:21:52.117 "product_name": "Malloc disk", 00:21:52.117 "block_size": 512, 00:21:52.117 "num_blocks": 65536, 00:21:52.117 "uuid": "3655a16a-9911-4f35-bc52-96eecd89cd1c", 00:21:52.117 "assigned_rate_limits": { 00:21:52.117 "rw_ios_per_sec": 0, 00:21:52.117 "rw_mbytes_per_sec": 0, 00:21:52.117 
"r_mbytes_per_sec": 0, 00:21:52.117 "w_mbytes_per_sec": 0 00:21:52.117 }, 00:21:52.117 "claimed": true, 00:21:52.117 "claim_type": "exclusive_write", 00:21:52.117 "zoned": false, 00:21:52.117 "supported_io_types": { 00:21:52.117 "read": true, 00:21:52.117 "write": true, 00:21:52.117 "unmap": true, 00:21:52.117 "flush": true, 00:21:52.117 "reset": true, 00:21:52.117 "nvme_admin": false, 00:21:52.117 "nvme_io": false, 00:21:52.117 "nvme_io_md": false, 00:21:52.117 "write_zeroes": true, 00:21:52.117 "zcopy": true, 00:21:52.117 "get_zone_info": false, 00:21:52.117 "zone_management": false, 00:21:52.117 "zone_append": false, 00:21:52.117 "compare": false, 00:21:52.117 "compare_and_write": false, 00:21:52.117 "abort": true, 00:21:52.117 "seek_hole": false, 00:21:52.117 "seek_data": false, 00:21:52.117 "copy": true, 00:21:52.117 "nvme_iov_md": false 00:21:52.117 }, 00:21:52.117 "memory_domains": [ 00:21:52.117 { 00:21:52.117 "dma_device_id": "system", 00:21:52.117 "dma_device_type": 1 00:21:52.117 }, 00:21:52.117 { 00:21:52.117 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.117 "dma_device_type": 2 00:21:52.117 } 00:21:52.117 ], 00:21:52.117 "driver_specific": {} 00:21:52.117 }' 00:21:52.117 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:52.117 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:52.117 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:52.117 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:52.117 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:52.374 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:52.374 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:52.374 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:52.374 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:52.374 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:52.374 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:52.374 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:52.374 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:52.374 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:52.374 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:52.632 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:52.632 "name": "BaseBdev3", 00:21:52.632 "aliases": [ 00:21:52.632 "3a8155a3-7a79-4d91-b769-389bba591d9d" 00:21:52.632 ], 00:21:52.632 "product_name": "Malloc disk", 00:21:52.632 "block_size": 512, 00:21:52.632 "num_blocks": 65536, 00:21:52.632 "uuid": "3a8155a3-7a79-4d91-b769-389bba591d9d", 00:21:52.632 "assigned_rate_limits": { 00:21:52.632 "rw_ios_per_sec": 0, 00:21:52.632 "rw_mbytes_per_sec": 0, 00:21:52.632 "r_mbytes_per_sec": 0, 00:21:52.632 "w_mbytes_per_sec": 0 00:21:52.632 }, 00:21:52.632 "claimed": true, 00:21:52.632 "claim_type": "exclusive_write", 00:21:52.632 "zoned": false, 
00:21:52.632 "supported_io_types": { 00:21:52.632 "read": true, 00:21:52.632 "write": true, 00:21:52.632 "unmap": true, 00:21:52.632 "flush": true, 00:21:52.632 "reset": true, 00:21:52.632 "nvme_admin": false, 00:21:52.632 "nvme_io": false, 00:21:52.632 "nvme_io_md": false, 00:21:52.632 "write_zeroes": true, 00:21:52.632 "zcopy": true, 00:21:52.632 "get_zone_info": false, 00:21:52.632 "zone_management": false, 00:21:52.632 "zone_append": false, 00:21:52.632 "compare": false, 00:21:52.632 "compare_and_write": false, 00:21:52.632 "abort": true, 00:21:52.632 "seek_hole": false, 00:21:52.632 "seek_data": false, 00:21:52.632 "copy": true, 00:21:52.632 "nvme_iov_md": false 00:21:52.632 }, 00:21:52.632 "memory_domains": [ 00:21:52.632 { 00:21:52.632 "dma_device_id": "system", 00:21:52.632 "dma_device_type": 1 00:21:52.632 }, 00:21:52.632 { 00:21:52.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.632 "dma_device_type": 2 00:21:52.632 } 00:21:52.632 ], 00:21:52.632 "driver_specific": {} 00:21:52.632 }' 00:21:52.632 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:52.632 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:52.890 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:52.890 22:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:52.890 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:52.890 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:52.890 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:52.890 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:52.890 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:52.890 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:52.890 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.147 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:53.147 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:53.147 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:53.147 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:53.405 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:53.405 "name": "BaseBdev4", 00:21:53.405 "aliases": [ 00:21:53.405 "9bc67df5-005c-4192-a534-3786824377e3" 00:21:53.405 ], 00:21:53.405 "product_name": "Malloc disk", 00:21:53.405 "block_size": 512, 00:21:53.405 "num_blocks": 65536, 00:21:53.405 "uuid": "9bc67df5-005c-4192-a534-3786824377e3", 00:21:53.405 "assigned_rate_limits": { 00:21:53.405 "rw_ios_per_sec": 0, 00:21:53.405 "rw_mbytes_per_sec": 0, 00:21:53.405 "r_mbytes_per_sec": 0, 00:21:53.405 "w_mbytes_per_sec": 0 00:21:53.405 }, 00:21:53.405 "claimed": true, 00:21:53.405 "claim_type": "exclusive_write", 00:21:53.405 "zoned": false, 00:21:53.405 "supported_io_types": { 00:21:53.405 "read": true, 00:21:53.405 "write": true, 00:21:53.405 "unmap": true, 00:21:53.405 "flush": true, 00:21:53.405 "reset": true, 
00:21:53.405 "nvme_admin": false, 00:21:53.405 "nvme_io": false, 00:21:53.405 "nvme_io_md": false, 00:21:53.405 "write_zeroes": true, 00:21:53.405 "zcopy": true, 00:21:53.405 "get_zone_info": false, 00:21:53.405 "zone_management": false, 00:21:53.405 "zone_append": false, 00:21:53.405 "compare": false, 00:21:53.405 "compare_and_write": false, 00:21:53.405 "abort": true, 00:21:53.405 "seek_hole": false, 00:21:53.405 "seek_data": false, 00:21:53.405 "copy": true, 00:21:53.405 "nvme_iov_md": false 00:21:53.405 }, 00:21:53.405 "memory_domains": [ 00:21:53.405 { 00:21:53.405 "dma_device_id": "system", 00:21:53.405 "dma_device_type": 1 00:21:53.405 }, 00:21:53.405 { 00:21:53.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.405 "dma_device_type": 2 00:21:53.405 } 00:21:53.405 ], 00:21:53.405 "driver_specific": {} 00:21:53.405 }' 00:21:53.405 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.405 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.405 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:53.405 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.405 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.405 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:53.405 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.405 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.662 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:53.662 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.662 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.662 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:53.662 22:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:53.921 [2024-07-12 22:29:04.004282] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:53.921 "name": "Existed_Raid", 00:21:53.921 "uuid": "d44950a4-c6e2-4930-aab5-2b3fc3f7b2f4", 00:21:53.921 "strip_size_kb": 0, 00:21:53.921 "state": "online", 00:21:53.921 "raid_level": "raid1", 00:21:53.921 "superblock": false, 00:21:53.921 "num_base_bdevs": 4, 00:21:53.921 "num_base_bdevs_discovered": 3, 00:21:53.921 "num_base_bdevs_operational": 3, 00:21:53.921 "base_bdevs_list": [ 00:21:53.921 { 00:21:53.921 "name": null, 00:21:53.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:53.921 "is_configured": false, 00:21:53.921 "data_offset": 0, 00:21:53.921 "data_size": 65536 00:21:53.921 }, 00:21:53.921 { 00:21:53.921 "name": "BaseBdev2", 00:21:53.921 "uuid": "3655a16a-9911-4f35-bc52-96eecd89cd1c", 00:21:53.921 "is_configured": true, 00:21:53.921 "data_offset": 0, 00:21:53.921 "data_size": 65536 00:21:53.921 }, 00:21:53.921 { 00:21:53.921 "name": "BaseBdev3", 00:21:53.921 "uuid": "3a8155a3-7a79-4d91-b769-389bba591d9d", 00:21:53.921 "is_configured": true, 00:21:53.921 "data_offset": 0, 00:21:53.921 "data_size": 65536 00:21:53.921 }, 00:21:53.921 { 00:21:53.921 "name": "BaseBdev4", 00:21:53.921 "uuid": "9bc67df5-005c-4192-a534-3786824377e3", 00:21:53.921 "is_configured": true, 00:21:53.921 "data_offset": 0, 00:21:53.921 "data_size": 65536 00:21:53.921 } 00:21:53.921 ] 00:21:53.921 }' 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:53.921 22:29:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:54.486 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:54.486 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:54.486 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.486 22:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:54.743 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:54.744 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:54.744 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:55.001 [2024-07-12 22:29:05.280691] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:55.001 22:29:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:55.001 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:55.001 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.001 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:55.259 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:55.259 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:55.259 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:55.517 [2024-07-12 22:29:05.774486] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:55.517 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:55.517 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:55.517 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.517 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:55.775 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:55.775 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:55.775 22:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:56.033 [2024-07-12 22:29:06.188172] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:56.033 [2024-07-12 22:29:06.188258] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:56.033 [2024-07-12 22:29:06.199462] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:56.033 [2024-07-12 22:29:06.199496] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:56.033 [2024-07-12 22:29:06.199508] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe2a350 name Existed_Raid, state offline 00:21:56.033 22:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:56.033 22:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:56.033 22:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.033 22:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:56.597 22:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:56.597 22:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:56.597 22:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:56.597 22:29:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:56.597 22:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:56.597 22:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:56.854 BaseBdev2 00:21:56.854 22:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:56.854 22:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:56.854 22:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:56.854 22:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:56.854 22:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:56.854 22:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:56.854 22:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:57.112 22:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:57.370 [ 00:21:57.370 { 00:21:57.370 "name": "BaseBdev2", 00:21:57.370 "aliases": [ 00:21:57.370 "fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91" 00:21:57.370 ], 00:21:57.370 "product_name": "Malloc disk", 00:21:57.370 "block_size": 512, 00:21:57.370 "num_blocks": 65536, 00:21:57.370 "uuid": "fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91", 00:21:57.370 "assigned_rate_limits": { 00:21:57.370 "rw_ios_per_sec": 0, 00:21:57.370 "rw_mbytes_per_sec": 0, 00:21:57.370 "r_mbytes_per_sec": 0, 00:21:57.370 "w_mbytes_per_sec": 0 00:21:57.370 }, 00:21:57.370 "claimed": false, 00:21:57.370 "zoned": false, 00:21:57.370 "supported_io_types": { 00:21:57.370 "read": true, 00:21:57.370 "write": true, 00:21:57.370 "unmap": true, 00:21:57.370 "flush": true, 00:21:57.370 "reset": true, 00:21:57.370 "nvme_admin": false, 00:21:57.370 "nvme_io": false, 00:21:57.370 "nvme_io_md": false, 00:21:57.370 "write_zeroes": true, 00:21:57.370 "zcopy": true, 00:21:57.370 "get_zone_info": false, 00:21:57.370 "zone_management": false, 00:21:57.370 "zone_append": false, 00:21:57.370 "compare": false, 00:21:57.370 "compare_and_write": false, 00:21:57.370 "abort": true, 00:21:57.370 "seek_hole": false, 00:21:57.370 "seek_data": false, 00:21:57.370 "copy": true, 00:21:57.370 "nvme_iov_md": false 00:21:57.370 }, 00:21:57.370 "memory_domains": [ 00:21:57.370 { 00:21:57.370 "dma_device_id": "system", 00:21:57.370 "dma_device_type": 1 00:21:57.370 }, 00:21:57.370 { 00:21:57.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:57.370 "dma_device_type": 2 00:21:57.370 } 00:21:57.370 ], 00:21:57.370 "driver_specific": {} 00:21:57.370 } 00:21:57.370 ] 00:21:57.370 22:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:57.370 22:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:57.370 22:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:57.370 22:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:57.628 BaseBdev3 00:21:57.628 22:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:57.628 22:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:57.628 22:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:57.628 22:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:57.628 22:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:57.628 22:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:57.628 22:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:57.886 22:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:57.886 [ 00:21:57.886 { 00:21:57.886 "name": "BaseBdev3", 00:21:57.886 "aliases": [ 00:21:57.886 "0aae71cb-7291-4a7f-9cef-a9fddf5702d8" 00:21:57.886 ], 00:21:57.886 "product_name": "Malloc disk", 00:21:57.886 "block_size": 512, 00:21:57.886 "num_blocks": 65536, 00:21:57.886 "uuid": "0aae71cb-7291-4a7f-9cef-a9fddf5702d8", 00:21:57.886 "assigned_rate_limits": { 00:21:57.886 "rw_ios_per_sec": 0, 00:21:57.886 "rw_mbytes_per_sec": 0, 00:21:57.886 "r_mbytes_per_sec": 0, 00:21:57.886 "w_mbytes_per_sec": 0 00:21:57.886 }, 00:21:57.886 "claimed": false, 00:21:57.886 "zoned": false, 00:21:57.886 "supported_io_types": { 00:21:57.886 "read": true, 00:21:57.886 "write": true, 00:21:57.886 "unmap": true, 00:21:57.886 "flush": true, 00:21:57.886 "reset": true, 00:21:57.886 "nvme_admin": false, 00:21:57.886 "nvme_io": false, 00:21:57.886 "nvme_io_md": false, 00:21:57.886 "write_zeroes": true, 00:21:57.886 "zcopy": true, 00:21:57.886 "get_zone_info": false, 00:21:57.886 "zone_management": false, 00:21:57.886 "zone_append": false, 00:21:57.886 "compare": false, 00:21:57.886 "compare_and_write": false, 00:21:57.886 "abort": true, 00:21:57.886 "seek_hole": false, 00:21:57.886 "seek_data": false, 00:21:57.886 "copy": true, 00:21:57.886 "nvme_iov_md": false 00:21:57.886 }, 00:21:57.886 "memory_domains": [ 00:21:57.886 { 00:21:57.886 "dma_device_id": "system", 00:21:57.886 "dma_device_type": 1 00:21:57.886 }, 00:21:57.886 { 00:21:57.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:57.886 "dma_device_type": 2 00:21:57.886 } 00:21:57.886 ], 00:21:57.886 "driver_specific": {} 00:21:57.886 } 00:21:57.886 ] 00:21:57.886 22:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:57.886 22:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:57.886 22:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:57.886 22:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:58.144 BaseBdev4 00:21:58.144 22:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:58.144 22:29:08 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:58.144 22:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:58.144 22:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:58.144 22:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:58.144 22:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:58.144 22:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:58.429 22:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:58.699 [ 00:21:58.699 { 00:21:58.699 "name": "BaseBdev4", 00:21:58.699 "aliases": [ 00:21:58.699 "d1f49418-9557-4273-a7fe-7a3b93edd6ba" 00:21:58.699 ], 00:21:58.699 "product_name": "Malloc disk", 00:21:58.699 "block_size": 512, 00:21:58.699 "num_blocks": 65536, 00:21:58.699 "uuid": "d1f49418-9557-4273-a7fe-7a3b93edd6ba", 00:21:58.699 "assigned_rate_limits": { 00:21:58.699 "rw_ios_per_sec": 0, 00:21:58.699 "rw_mbytes_per_sec": 0, 00:21:58.699 "r_mbytes_per_sec": 0, 00:21:58.699 "w_mbytes_per_sec": 0 00:21:58.699 }, 00:21:58.699 "claimed": false, 00:21:58.699 "zoned": false, 00:21:58.699 "supported_io_types": { 00:21:58.699 "read": true, 00:21:58.699 "write": true, 00:21:58.700 "unmap": true, 00:21:58.700 "flush": true, 00:21:58.700 "reset": true, 00:21:58.700 "nvme_admin": false, 00:21:58.700 "nvme_io": false, 00:21:58.700 "nvme_io_md": false, 00:21:58.700 "write_zeroes": true, 00:21:58.700 "zcopy": true, 00:21:58.700 "get_zone_info": false, 00:21:58.700 "zone_management": false, 00:21:58.700 "zone_append": false, 00:21:58.700 "compare": false, 00:21:58.700 "compare_and_write": false, 00:21:58.700 "abort": true, 00:21:58.700 "seek_hole": false, 00:21:58.700 "seek_data": false, 00:21:58.700 "copy": true, 00:21:58.700 "nvme_iov_md": false 00:21:58.700 }, 00:21:58.700 "memory_domains": [ 00:21:58.700 { 00:21:58.700 "dma_device_id": "system", 00:21:58.700 "dma_device_type": 1 00:21:58.700 }, 00:21:58.700 { 00:21:58.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.700 "dma_device_type": 2 00:21:58.700 } 00:21:58.700 ], 00:21:58.700 "driver_specific": {} 00:21:58.700 } 00:21:58.700 ] 00:21:58.700 22:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:58.700 22:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:58.700 22:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:58.700 22:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:58.958 [2024-07-12 22:29:09.149303] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:58.958 [2024-07-12 22:29:09.149349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:58.958 [2024-07-12 22:29:09.149373] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 
00:21:58.958 [2024-07-12 22:29:09.150762] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:58.958 [2024-07-12 22:29:09.150805] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:58.958 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:58.958 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:58.958 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:58.958 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.958 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.958 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:58.958 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.958 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.958 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.958 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.958 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.958 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:59.217 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.217 "name": "Existed_Raid", 00:21:59.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.217 "strip_size_kb": 0, 00:21:59.217 "state": "configuring", 00:21:59.217 "raid_level": "raid1", 00:21:59.217 "superblock": false, 00:21:59.217 "num_base_bdevs": 4, 00:21:59.217 "num_base_bdevs_discovered": 3, 00:21:59.217 "num_base_bdevs_operational": 4, 00:21:59.217 "base_bdevs_list": [ 00:21:59.217 { 00:21:59.217 "name": "BaseBdev1", 00:21:59.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.217 "is_configured": false, 00:21:59.217 "data_offset": 0, 00:21:59.217 "data_size": 0 00:21:59.217 }, 00:21:59.217 { 00:21:59.217 "name": "BaseBdev2", 00:21:59.217 "uuid": "fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91", 00:21:59.217 "is_configured": true, 00:21:59.217 "data_offset": 0, 00:21:59.217 "data_size": 65536 00:21:59.217 }, 00:21:59.217 { 00:21:59.217 "name": "BaseBdev3", 00:21:59.217 "uuid": "0aae71cb-7291-4a7f-9cef-a9fddf5702d8", 00:21:59.217 "is_configured": true, 00:21:59.217 "data_offset": 0, 00:21:59.217 "data_size": 65536 00:21:59.217 }, 00:21:59.217 { 00:21:59.217 "name": "BaseBdev4", 00:21:59.217 "uuid": "d1f49418-9557-4273-a7fe-7a3b93edd6ba", 00:21:59.217 "is_configured": true, 00:21:59.217 "data_offset": 0, 00:21:59.217 "data_size": 65536 00:21:59.217 } 00:21:59.217 ] 00:21:59.217 }' 00:21:59.217 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.217 22:29:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:59.783 22:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev2 00:22:00.041 [2024-07-12 22:29:10.208095] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:00.041 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:00.041 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:00.041 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:00.041 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:00.041 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:00.041 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:00.041 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.041 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.041 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.041 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.041 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.041 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:00.299 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.299 "name": "Existed_Raid", 00:22:00.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.299 "strip_size_kb": 0, 00:22:00.299 "state": "configuring", 00:22:00.299 "raid_level": "raid1", 00:22:00.300 "superblock": false, 00:22:00.300 "num_base_bdevs": 4, 00:22:00.300 "num_base_bdevs_discovered": 2, 00:22:00.300 "num_base_bdevs_operational": 4, 00:22:00.300 "base_bdevs_list": [ 00:22:00.300 { 00:22:00.300 "name": "BaseBdev1", 00:22:00.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.300 "is_configured": false, 00:22:00.300 "data_offset": 0, 00:22:00.300 "data_size": 0 00:22:00.300 }, 00:22:00.300 { 00:22:00.300 "name": null, 00:22:00.300 "uuid": "fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91", 00:22:00.300 "is_configured": false, 00:22:00.300 "data_offset": 0, 00:22:00.300 "data_size": 65536 00:22:00.300 }, 00:22:00.300 { 00:22:00.300 "name": "BaseBdev3", 00:22:00.300 "uuid": "0aae71cb-7291-4a7f-9cef-a9fddf5702d8", 00:22:00.300 "is_configured": true, 00:22:00.300 "data_offset": 0, 00:22:00.300 "data_size": 65536 00:22:00.300 }, 00:22:00.300 { 00:22:00.300 "name": "BaseBdev4", 00:22:00.300 "uuid": "d1f49418-9557-4273-a7fe-7a3b93edd6ba", 00:22:00.300 "is_configured": true, 00:22:00.300 "data_offset": 0, 00:22:00.300 "data_size": 65536 00:22:00.300 } 00:22:00.300 ] 00:22:00.300 }' 00:22:00.300 22:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.300 22:29:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:00.866 22:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.866 22:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:22:01.125 22:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:01.125 22:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:01.383 [2024-07-12 22:29:11.516176] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:01.383 BaseBdev1 00:22:01.383 22:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:01.383 22:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:01.383 22:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:01.383 22:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:01.383 22:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:01.383 22:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:01.383 22:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:01.640 22:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:01.898 [ 00:22:01.898 { 00:22:01.898 "name": "BaseBdev1", 00:22:01.898 "aliases": [ 00:22:01.898 "91a070a3-6c53-4fdb-94eb-56353f611f20" 00:22:01.898 ], 00:22:01.898 "product_name": "Malloc disk", 00:22:01.898 "block_size": 512, 00:22:01.898 "num_blocks": 65536, 00:22:01.898 "uuid": "91a070a3-6c53-4fdb-94eb-56353f611f20", 00:22:01.898 "assigned_rate_limits": { 00:22:01.898 "rw_ios_per_sec": 0, 00:22:01.898 "rw_mbytes_per_sec": 0, 00:22:01.898 "r_mbytes_per_sec": 0, 00:22:01.898 "w_mbytes_per_sec": 0 00:22:01.898 }, 00:22:01.898 "claimed": true, 00:22:01.898 "claim_type": "exclusive_write", 00:22:01.898 "zoned": false, 00:22:01.898 "supported_io_types": { 00:22:01.898 "read": true, 00:22:01.898 "write": true, 00:22:01.898 "unmap": true, 00:22:01.898 "flush": true, 00:22:01.898 "reset": true, 00:22:01.898 "nvme_admin": false, 00:22:01.898 "nvme_io": false, 00:22:01.898 "nvme_io_md": false, 00:22:01.898 "write_zeroes": true, 00:22:01.898 "zcopy": true, 00:22:01.898 "get_zone_info": false, 00:22:01.898 "zone_management": false, 00:22:01.898 "zone_append": false, 00:22:01.898 "compare": false, 00:22:01.898 "compare_and_write": false, 00:22:01.898 "abort": true, 00:22:01.898 "seek_hole": false, 00:22:01.898 "seek_data": false, 00:22:01.898 "copy": true, 00:22:01.898 "nvme_iov_md": false 00:22:01.898 }, 00:22:01.898 "memory_domains": [ 00:22:01.898 { 00:22:01.898 "dma_device_id": "system", 00:22:01.899 "dma_device_type": 1 00:22:01.899 }, 00:22:01.899 { 00:22:01.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.899 "dma_device_type": 2 00:22:01.899 } 00:22:01.899 ], 00:22:01.899 "driver_specific": {} 00:22:01.899 } 00:22:01.899 ] 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:01.899 22:29:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:01.899 "name": "Existed_Raid", 00:22:01.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.899 "strip_size_kb": 0, 00:22:01.899 "state": "configuring", 00:22:01.899 "raid_level": "raid1", 00:22:01.899 "superblock": false, 00:22:01.899 "num_base_bdevs": 4, 00:22:01.899 "num_base_bdevs_discovered": 3, 00:22:01.899 "num_base_bdevs_operational": 4, 00:22:01.899 "base_bdevs_list": [ 00:22:01.899 { 00:22:01.899 "name": "BaseBdev1", 00:22:01.899 "uuid": "91a070a3-6c53-4fdb-94eb-56353f611f20", 00:22:01.899 "is_configured": true, 00:22:01.899 "data_offset": 0, 00:22:01.899 "data_size": 65536 00:22:01.899 }, 00:22:01.899 { 00:22:01.899 "name": null, 00:22:01.899 "uuid": "fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91", 00:22:01.899 "is_configured": false, 00:22:01.899 "data_offset": 0, 00:22:01.899 "data_size": 65536 00:22:01.899 }, 00:22:01.899 { 00:22:01.899 "name": "BaseBdev3", 00:22:01.899 "uuid": "0aae71cb-7291-4a7f-9cef-a9fddf5702d8", 00:22:01.899 "is_configured": true, 00:22:01.899 "data_offset": 0, 00:22:01.899 "data_size": 65536 00:22:01.899 }, 00:22:01.899 { 00:22:01.899 "name": "BaseBdev4", 00:22:01.899 "uuid": "d1f49418-9557-4273-a7fe-7a3b93edd6ba", 00:22:01.899 "is_configured": true, 00:22:01.899 "data_offset": 0, 00:22:01.899 "data_size": 65536 00:22:01.899 } 00:22:01.899 ] 00:22:01.899 }' 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:01.899 22:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:02.466 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.466 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:02.733 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:02.733 22:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:03.299 [2024-07-12 22:29:13.461377] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:03.299 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:03.299 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:03.299 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:03.299 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:03.299 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:03.299 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:03.299 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:03.299 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:03.299 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:03.299 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:03.299 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.299 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:03.556 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:03.556 "name": "Existed_Raid", 00:22:03.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:03.557 "strip_size_kb": 0, 00:22:03.557 "state": "configuring", 00:22:03.557 "raid_level": "raid1", 00:22:03.557 "superblock": false, 00:22:03.557 "num_base_bdevs": 4, 00:22:03.557 "num_base_bdevs_discovered": 2, 00:22:03.557 "num_base_bdevs_operational": 4, 00:22:03.557 "base_bdevs_list": [ 00:22:03.557 { 00:22:03.557 "name": "BaseBdev1", 00:22:03.557 "uuid": "91a070a3-6c53-4fdb-94eb-56353f611f20", 00:22:03.557 "is_configured": true, 00:22:03.557 "data_offset": 0, 00:22:03.557 "data_size": 65536 00:22:03.557 }, 00:22:03.557 { 00:22:03.557 "name": null, 00:22:03.557 "uuid": "fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91", 00:22:03.557 "is_configured": false, 00:22:03.557 "data_offset": 0, 00:22:03.557 "data_size": 65536 00:22:03.557 }, 00:22:03.557 { 00:22:03.557 "name": null, 00:22:03.557 "uuid": "0aae71cb-7291-4a7f-9cef-a9fddf5702d8", 00:22:03.557 "is_configured": false, 00:22:03.557 "data_offset": 0, 00:22:03.557 "data_size": 65536 00:22:03.557 }, 00:22:03.557 { 00:22:03.557 "name": "BaseBdev4", 00:22:03.557 "uuid": "d1f49418-9557-4273-a7fe-7a3b93edd6ba", 00:22:03.557 "is_configured": true, 00:22:03.557 "data_offset": 0, 00:22:03.557 "data_size": 65536 00:22:03.557 } 00:22:03.557 ] 00:22:03.557 }' 00:22:03.557 22:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:03.557 22:29:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:04.123 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:04.123 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.382 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:04.382 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:04.640 [2024-07-12 22:29:14.801033] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:04.640 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:04.640 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:04.640 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:04.640 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:04.640 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:04.640 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:04.640 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.640 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:04.640 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.640 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.640 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.640 22:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:04.899 22:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:04.899 "name": "Existed_Raid", 00:22:04.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.899 "strip_size_kb": 0, 00:22:04.899 "state": "configuring", 00:22:04.899 "raid_level": "raid1", 00:22:04.899 "superblock": false, 00:22:04.899 "num_base_bdevs": 4, 00:22:04.899 "num_base_bdevs_discovered": 3, 00:22:04.899 "num_base_bdevs_operational": 4, 00:22:04.899 "base_bdevs_list": [ 00:22:04.899 { 00:22:04.899 "name": "BaseBdev1", 00:22:04.899 "uuid": "91a070a3-6c53-4fdb-94eb-56353f611f20", 00:22:04.899 "is_configured": true, 00:22:04.899 "data_offset": 0, 00:22:04.899 "data_size": 65536 00:22:04.899 }, 00:22:04.899 { 00:22:04.899 "name": null, 00:22:04.899 "uuid": "fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91", 00:22:04.899 "is_configured": false, 00:22:04.899 "data_offset": 0, 00:22:04.899 "data_size": 65536 00:22:04.899 }, 00:22:04.899 { 00:22:04.899 "name": "BaseBdev3", 00:22:04.899 "uuid": "0aae71cb-7291-4a7f-9cef-a9fddf5702d8", 00:22:04.899 "is_configured": true, 00:22:04.899 "data_offset": 0, 00:22:04.899 "data_size": 65536 00:22:04.899 }, 00:22:04.899 { 00:22:04.899 "name": "BaseBdev4", 00:22:04.899 "uuid": "d1f49418-9557-4273-a7fe-7a3b93edd6ba", 00:22:04.899 "is_configured": true, 00:22:04.899 "data_offset": 0, 00:22:04.899 "data_size": 65536 00:22:04.899 } 00:22:04.899 ] 00:22:04.899 }' 00:22:04.899 22:29:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:04.899 22:29:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:05.466 22:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.466 22:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:05.724 22:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:05.724 22:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:05.982 [2024-07-12 22:29:16.152631] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:05.982 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:05.982 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:05.982 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:05.982 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:05.982 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:05.982 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:05.982 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:05.982 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:05.982 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:05.982 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:05.982 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.982 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:06.240 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.240 "name": "Existed_Raid", 00:22:06.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:06.240 "strip_size_kb": 0, 00:22:06.240 "state": "configuring", 00:22:06.240 "raid_level": "raid1", 00:22:06.240 "superblock": false, 00:22:06.240 "num_base_bdevs": 4, 00:22:06.240 "num_base_bdevs_discovered": 2, 00:22:06.240 "num_base_bdevs_operational": 4, 00:22:06.240 "base_bdevs_list": [ 00:22:06.240 { 00:22:06.240 "name": null, 00:22:06.240 "uuid": "91a070a3-6c53-4fdb-94eb-56353f611f20", 00:22:06.240 "is_configured": false, 00:22:06.240 "data_offset": 0, 00:22:06.240 "data_size": 65536 00:22:06.240 }, 00:22:06.240 { 00:22:06.240 "name": null, 00:22:06.240 "uuid": "fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91", 00:22:06.240 "is_configured": false, 00:22:06.240 "data_offset": 0, 00:22:06.240 "data_size": 65536 00:22:06.240 }, 00:22:06.240 { 00:22:06.240 "name": "BaseBdev3", 00:22:06.240 "uuid": "0aae71cb-7291-4a7f-9cef-a9fddf5702d8", 00:22:06.240 "is_configured": true, 00:22:06.240 "data_offset": 0, 00:22:06.240 "data_size": 65536 00:22:06.240 }, 
00:22:06.240 { 00:22:06.240 "name": "BaseBdev4", 00:22:06.240 "uuid": "d1f49418-9557-4273-a7fe-7a3b93edd6ba", 00:22:06.240 "is_configured": true, 00:22:06.240 "data_offset": 0, 00:22:06.240 "data_size": 65536 00:22:06.240 } 00:22:06.240 ] 00:22:06.240 }' 00:22:06.240 22:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.240 22:29:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:06.807 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.807 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:07.067 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:07.067 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:07.636 [2024-07-12 22:29:17.759650] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:07.636 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:07.636 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:07.636 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:07.636 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:07.636 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:07.636 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:07.636 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:07.636 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:07.636 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:07.636 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:07.636 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.636 22:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:07.895 22:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.895 "name": "Existed_Raid", 00:22:07.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.895 "strip_size_kb": 0, 00:22:07.895 "state": "configuring", 00:22:07.895 "raid_level": "raid1", 00:22:07.895 "superblock": false, 00:22:07.895 "num_base_bdevs": 4, 00:22:07.895 "num_base_bdevs_discovered": 3, 00:22:07.895 "num_base_bdevs_operational": 4, 00:22:07.895 "base_bdevs_list": [ 00:22:07.895 { 00:22:07.895 "name": null, 00:22:07.895 "uuid": "91a070a3-6c53-4fdb-94eb-56353f611f20", 00:22:07.895 "is_configured": false, 00:22:07.895 "data_offset": 0, 00:22:07.895 "data_size": 65536 00:22:07.895 }, 00:22:07.895 { 00:22:07.895 "name": "BaseBdev2", 00:22:07.895 "uuid": 
"fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91", 00:22:07.895 "is_configured": true, 00:22:07.895 "data_offset": 0, 00:22:07.895 "data_size": 65536 00:22:07.895 }, 00:22:07.895 { 00:22:07.895 "name": "BaseBdev3", 00:22:07.895 "uuid": "0aae71cb-7291-4a7f-9cef-a9fddf5702d8", 00:22:07.895 "is_configured": true, 00:22:07.895 "data_offset": 0, 00:22:07.895 "data_size": 65536 00:22:07.895 }, 00:22:07.895 { 00:22:07.895 "name": "BaseBdev4", 00:22:07.895 "uuid": "d1f49418-9557-4273-a7fe-7a3b93edd6ba", 00:22:07.895 "is_configured": true, 00:22:07.895 "data_offset": 0, 00:22:07.895 "data_size": 65536 00:22:07.895 } 00:22:07.895 ] 00:22:07.895 }' 00:22:07.895 22:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:07.895 22:29:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:08.463 22:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.463 22:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:08.722 22:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:08.722 22:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.722 22:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:08.983 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 91a070a3-6c53-4fdb-94eb-56353f611f20 00:22:09.243 [2024-07-12 22:29:19.348695] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:09.243 [2024-07-12 22:29:19.348745] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe28610 00:22:09.243 [2024-07-12 22:29:19.348754] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:09.243 [2024-07-12 22:29:19.348967] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe29a70 00:22:09.243 [2024-07-12 22:29:19.349097] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe28610 00:22:09.243 [2024-07-12 22:29:19.349108] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe28610 00:22:09.243 [2024-07-12 22:29:19.349285] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:09.243 NewBaseBdev 00:22:09.243 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:09.243 22:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:22:09.243 22:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:09.243 22:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:09.243 22:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:09.243 22:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:09.243 22:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:09.502 22:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:09.502 [ 00:22:09.502 { 00:22:09.502 "name": "NewBaseBdev", 00:22:09.502 "aliases": [ 00:22:09.502 "91a070a3-6c53-4fdb-94eb-56353f611f20" 00:22:09.502 ], 00:22:09.502 "product_name": "Malloc disk", 00:22:09.502 "block_size": 512, 00:22:09.502 "num_blocks": 65536, 00:22:09.502 "uuid": "91a070a3-6c53-4fdb-94eb-56353f611f20", 00:22:09.502 "assigned_rate_limits": { 00:22:09.502 "rw_ios_per_sec": 0, 00:22:09.502 "rw_mbytes_per_sec": 0, 00:22:09.502 "r_mbytes_per_sec": 0, 00:22:09.502 "w_mbytes_per_sec": 0 00:22:09.502 }, 00:22:09.502 "claimed": true, 00:22:09.502 "claim_type": "exclusive_write", 00:22:09.502 "zoned": false, 00:22:09.502 "supported_io_types": { 00:22:09.502 "read": true, 00:22:09.502 "write": true, 00:22:09.502 "unmap": true, 00:22:09.502 "flush": true, 00:22:09.502 "reset": true, 00:22:09.503 "nvme_admin": false, 00:22:09.503 "nvme_io": false, 00:22:09.503 "nvme_io_md": false, 00:22:09.503 "write_zeroes": true, 00:22:09.503 "zcopy": true, 00:22:09.503 "get_zone_info": false, 00:22:09.503 "zone_management": false, 00:22:09.503 "zone_append": false, 00:22:09.503 "compare": false, 00:22:09.503 "compare_and_write": false, 00:22:09.503 "abort": true, 00:22:09.503 "seek_hole": false, 00:22:09.503 "seek_data": false, 00:22:09.503 "copy": true, 00:22:09.503 "nvme_iov_md": false 00:22:09.503 }, 00:22:09.503 "memory_domains": [ 00:22:09.503 { 00:22:09.503 "dma_device_id": "system", 00:22:09.503 "dma_device_type": 1 00:22:09.503 }, 00:22:09.503 { 00:22:09.503 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:09.503 "dma_device_type": 2 00:22:09.503 } 00:22:09.503 ], 00:22:09.503 "driver_specific": {} 00:22:09.503 } 00:22:09.503 ] 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.762 22:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:22:09.762 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:09.762 "name": "Existed_Raid", 00:22:09.762 "uuid": "3efb50c5-ea07-48e4-9928-3abc3e0eb7a2", 00:22:09.762 "strip_size_kb": 0, 00:22:09.762 "state": "online", 00:22:09.762 "raid_level": "raid1", 00:22:09.762 "superblock": false, 00:22:09.762 "num_base_bdevs": 4, 00:22:09.762 "num_base_bdevs_discovered": 4, 00:22:09.762 "num_base_bdevs_operational": 4, 00:22:09.762 "base_bdevs_list": [ 00:22:09.762 { 00:22:09.762 "name": "NewBaseBdev", 00:22:09.762 "uuid": "91a070a3-6c53-4fdb-94eb-56353f611f20", 00:22:09.762 "is_configured": true, 00:22:09.762 "data_offset": 0, 00:22:09.762 "data_size": 65536 00:22:09.762 }, 00:22:09.762 { 00:22:09.762 "name": "BaseBdev2", 00:22:09.762 "uuid": "fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91", 00:22:09.762 "is_configured": true, 00:22:09.762 "data_offset": 0, 00:22:09.762 "data_size": 65536 00:22:09.762 }, 00:22:09.762 { 00:22:09.762 "name": "BaseBdev3", 00:22:09.762 "uuid": "0aae71cb-7291-4a7f-9cef-a9fddf5702d8", 00:22:09.762 "is_configured": true, 00:22:09.762 "data_offset": 0, 00:22:09.762 "data_size": 65536 00:22:09.762 }, 00:22:09.762 { 00:22:09.762 "name": "BaseBdev4", 00:22:09.762 "uuid": "d1f49418-9557-4273-a7fe-7a3b93edd6ba", 00:22:09.762 "is_configured": true, 00:22:09.762 "data_offset": 0, 00:22:09.762 "data_size": 65536 00:22:09.762 } 00:22:09.762 ] 00:22:09.762 }' 00:22:09.762 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.022 22:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:10.590 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:10.590 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:10.590 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:10.590 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:10.590 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:10.590 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:10.590 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:10.590 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:10.590 [2024-07-12 22:29:20.893145] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:10.590 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:10.590 "name": "Existed_Raid", 00:22:10.590 "aliases": [ 00:22:10.590 "3efb50c5-ea07-48e4-9928-3abc3e0eb7a2" 00:22:10.590 ], 00:22:10.590 "product_name": "Raid Volume", 00:22:10.590 "block_size": 512, 00:22:10.590 "num_blocks": 65536, 00:22:10.590 "uuid": "3efb50c5-ea07-48e4-9928-3abc3e0eb7a2", 00:22:10.590 "assigned_rate_limits": { 00:22:10.590 "rw_ios_per_sec": 0, 00:22:10.590 "rw_mbytes_per_sec": 0, 00:22:10.590 "r_mbytes_per_sec": 0, 00:22:10.590 "w_mbytes_per_sec": 0 00:22:10.590 }, 00:22:10.590 "claimed": false, 00:22:10.590 "zoned": false, 00:22:10.590 "supported_io_types": { 00:22:10.590 "read": true, 00:22:10.590 "write": true, 00:22:10.590 "unmap": false, 
00:22:10.590 "flush": false, 00:22:10.590 "reset": true, 00:22:10.590 "nvme_admin": false, 00:22:10.590 "nvme_io": false, 00:22:10.590 "nvme_io_md": false, 00:22:10.590 "write_zeroes": true, 00:22:10.590 "zcopy": false, 00:22:10.590 "get_zone_info": false, 00:22:10.590 "zone_management": false, 00:22:10.590 "zone_append": false, 00:22:10.590 "compare": false, 00:22:10.590 "compare_and_write": false, 00:22:10.590 "abort": false, 00:22:10.590 "seek_hole": false, 00:22:10.590 "seek_data": false, 00:22:10.590 "copy": false, 00:22:10.590 "nvme_iov_md": false 00:22:10.590 }, 00:22:10.590 "memory_domains": [ 00:22:10.590 { 00:22:10.590 "dma_device_id": "system", 00:22:10.590 "dma_device_type": 1 00:22:10.590 }, 00:22:10.590 { 00:22:10.590 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.590 "dma_device_type": 2 00:22:10.590 }, 00:22:10.590 { 00:22:10.590 "dma_device_id": "system", 00:22:10.590 "dma_device_type": 1 00:22:10.590 }, 00:22:10.590 { 00:22:10.590 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.590 "dma_device_type": 2 00:22:10.590 }, 00:22:10.590 { 00:22:10.590 "dma_device_id": "system", 00:22:10.590 "dma_device_type": 1 00:22:10.590 }, 00:22:10.590 { 00:22:10.590 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.590 "dma_device_type": 2 00:22:10.590 }, 00:22:10.590 { 00:22:10.590 "dma_device_id": "system", 00:22:10.590 "dma_device_type": 1 00:22:10.590 }, 00:22:10.590 { 00:22:10.590 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.590 "dma_device_type": 2 00:22:10.590 } 00:22:10.590 ], 00:22:10.590 "driver_specific": { 00:22:10.590 "raid": { 00:22:10.590 "uuid": "3efb50c5-ea07-48e4-9928-3abc3e0eb7a2", 00:22:10.590 "strip_size_kb": 0, 00:22:10.590 "state": "online", 00:22:10.590 "raid_level": "raid1", 00:22:10.590 "superblock": false, 00:22:10.590 "num_base_bdevs": 4, 00:22:10.590 "num_base_bdevs_discovered": 4, 00:22:10.590 "num_base_bdevs_operational": 4, 00:22:10.590 "base_bdevs_list": [ 00:22:10.590 { 00:22:10.590 "name": "NewBaseBdev", 00:22:10.590 "uuid": "91a070a3-6c53-4fdb-94eb-56353f611f20", 00:22:10.590 "is_configured": true, 00:22:10.590 "data_offset": 0, 00:22:10.591 "data_size": 65536 00:22:10.591 }, 00:22:10.591 { 00:22:10.591 "name": "BaseBdev2", 00:22:10.591 "uuid": "fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91", 00:22:10.591 "is_configured": true, 00:22:10.591 "data_offset": 0, 00:22:10.591 "data_size": 65536 00:22:10.591 }, 00:22:10.591 { 00:22:10.591 "name": "BaseBdev3", 00:22:10.591 "uuid": "0aae71cb-7291-4a7f-9cef-a9fddf5702d8", 00:22:10.591 "is_configured": true, 00:22:10.591 "data_offset": 0, 00:22:10.591 "data_size": 65536 00:22:10.591 }, 00:22:10.591 { 00:22:10.591 "name": "BaseBdev4", 00:22:10.591 "uuid": "d1f49418-9557-4273-a7fe-7a3b93edd6ba", 00:22:10.591 "is_configured": true, 00:22:10.591 "data_offset": 0, 00:22:10.591 "data_size": 65536 00:22:10.591 } 00:22:10.591 ] 00:22:10.591 } 00:22:10.591 } 00:22:10.591 }' 00:22:10.849 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:10.849 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:10.849 BaseBdev2 00:22:10.849 BaseBdev3 00:22:10.849 BaseBdev4' 00:22:10.849 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:10.849 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b NewBaseBdev 00:22:10.849 22:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:11.109 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:11.109 "name": "NewBaseBdev", 00:22:11.109 "aliases": [ 00:22:11.109 "91a070a3-6c53-4fdb-94eb-56353f611f20" 00:22:11.109 ], 00:22:11.109 "product_name": "Malloc disk", 00:22:11.109 "block_size": 512, 00:22:11.109 "num_blocks": 65536, 00:22:11.109 "uuid": "91a070a3-6c53-4fdb-94eb-56353f611f20", 00:22:11.109 "assigned_rate_limits": { 00:22:11.109 "rw_ios_per_sec": 0, 00:22:11.109 "rw_mbytes_per_sec": 0, 00:22:11.109 "r_mbytes_per_sec": 0, 00:22:11.109 "w_mbytes_per_sec": 0 00:22:11.109 }, 00:22:11.109 "claimed": true, 00:22:11.109 "claim_type": "exclusive_write", 00:22:11.109 "zoned": false, 00:22:11.109 "supported_io_types": { 00:22:11.109 "read": true, 00:22:11.109 "write": true, 00:22:11.109 "unmap": true, 00:22:11.109 "flush": true, 00:22:11.109 "reset": true, 00:22:11.109 "nvme_admin": false, 00:22:11.109 "nvme_io": false, 00:22:11.109 "nvme_io_md": false, 00:22:11.109 "write_zeroes": true, 00:22:11.109 "zcopy": true, 00:22:11.109 "get_zone_info": false, 00:22:11.109 "zone_management": false, 00:22:11.109 "zone_append": false, 00:22:11.109 "compare": false, 00:22:11.109 "compare_and_write": false, 00:22:11.109 "abort": true, 00:22:11.109 "seek_hole": false, 00:22:11.109 "seek_data": false, 00:22:11.109 "copy": true, 00:22:11.109 "nvme_iov_md": false 00:22:11.109 }, 00:22:11.109 "memory_domains": [ 00:22:11.109 { 00:22:11.110 "dma_device_id": "system", 00:22:11.110 "dma_device_type": 1 00:22:11.110 }, 00:22:11.110 { 00:22:11.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.110 "dma_device_type": 2 00:22:11.110 } 00:22:11.110 ], 00:22:11.110 "driver_specific": {} 00:22:11.110 }' 00:22:11.110 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.110 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.110 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:11.110 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.110 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.110 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:11.110 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.110 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.369 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:11.369 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:11.369 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:11.369 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:11.369 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:11.369 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:11.369 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:11.627 22:29:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:11.628 "name": "BaseBdev2", 00:22:11.628 "aliases": [ 00:22:11.628 "fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91" 00:22:11.628 ], 00:22:11.628 "product_name": "Malloc disk", 00:22:11.628 "block_size": 512, 00:22:11.628 "num_blocks": 65536, 00:22:11.628 "uuid": "fcc61ae7-bf3c-411f-9dff-8b3bf9c55f91", 00:22:11.628 "assigned_rate_limits": { 00:22:11.628 "rw_ios_per_sec": 0, 00:22:11.628 "rw_mbytes_per_sec": 0, 00:22:11.628 "r_mbytes_per_sec": 0, 00:22:11.628 "w_mbytes_per_sec": 0 00:22:11.628 }, 00:22:11.628 "claimed": true, 00:22:11.628 "claim_type": "exclusive_write", 00:22:11.628 "zoned": false, 00:22:11.628 "supported_io_types": { 00:22:11.628 "read": true, 00:22:11.628 "write": true, 00:22:11.628 "unmap": true, 00:22:11.628 "flush": true, 00:22:11.628 "reset": true, 00:22:11.628 "nvme_admin": false, 00:22:11.628 "nvme_io": false, 00:22:11.628 "nvme_io_md": false, 00:22:11.628 "write_zeroes": true, 00:22:11.628 "zcopy": true, 00:22:11.628 "get_zone_info": false, 00:22:11.628 "zone_management": false, 00:22:11.628 "zone_append": false, 00:22:11.628 "compare": false, 00:22:11.628 "compare_and_write": false, 00:22:11.628 "abort": true, 00:22:11.628 "seek_hole": false, 00:22:11.628 "seek_data": false, 00:22:11.628 "copy": true, 00:22:11.628 "nvme_iov_md": false 00:22:11.628 }, 00:22:11.628 "memory_domains": [ 00:22:11.628 { 00:22:11.628 "dma_device_id": "system", 00:22:11.628 "dma_device_type": 1 00:22:11.628 }, 00:22:11.628 { 00:22:11.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.628 "dma_device_type": 2 00:22:11.628 } 00:22:11.628 ], 00:22:11.628 "driver_specific": {} 00:22:11.628 }' 00:22:11.628 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.628 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.628 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:11.628 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.628 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.886 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:11.886 22:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.886 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.886 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:11.886 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:11.886 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:11.886 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:11.886 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:11.886 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:11.886 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:12.144 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:12.144 "name": "BaseBdev3", 00:22:12.144 "aliases": [ 00:22:12.144 
"0aae71cb-7291-4a7f-9cef-a9fddf5702d8" 00:22:12.144 ], 00:22:12.144 "product_name": "Malloc disk", 00:22:12.144 "block_size": 512, 00:22:12.144 "num_blocks": 65536, 00:22:12.144 "uuid": "0aae71cb-7291-4a7f-9cef-a9fddf5702d8", 00:22:12.144 "assigned_rate_limits": { 00:22:12.144 "rw_ios_per_sec": 0, 00:22:12.144 "rw_mbytes_per_sec": 0, 00:22:12.144 "r_mbytes_per_sec": 0, 00:22:12.144 "w_mbytes_per_sec": 0 00:22:12.144 }, 00:22:12.144 "claimed": true, 00:22:12.144 "claim_type": "exclusive_write", 00:22:12.144 "zoned": false, 00:22:12.144 "supported_io_types": { 00:22:12.144 "read": true, 00:22:12.144 "write": true, 00:22:12.144 "unmap": true, 00:22:12.144 "flush": true, 00:22:12.144 "reset": true, 00:22:12.144 "nvme_admin": false, 00:22:12.144 "nvme_io": false, 00:22:12.144 "nvme_io_md": false, 00:22:12.144 "write_zeroes": true, 00:22:12.144 "zcopy": true, 00:22:12.144 "get_zone_info": false, 00:22:12.144 "zone_management": false, 00:22:12.144 "zone_append": false, 00:22:12.144 "compare": false, 00:22:12.144 "compare_and_write": false, 00:22:12.144 "abort": true, 00:22:12.144 "seek_hole": false, 00:22:12.144 "seek_data": false, 00:22:12.144 "copy": true, 00:22:12.144 "nvme_iov_md": false 00:22:12.144 }, 00:22:12.144 "memory_domains": [ 00:22:12.144 { 00:22:12.144 "dma_device_id": "system", 00:22:12.144 "dma_device_type": 1 00:22:12.144 }, 00:22:12.144 { 00:22:12.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.144 "dma_device_type": 2 00:22:12.144 } 00:22:12.144 ], 00:22:12.144 "driver_specific": {} 00:22:12.144 }' 00:22:12.144 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.144 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:12.403 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:12.662 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:12.662 "name": "BaseBdev4", 00:22:12.662 "aliases": [ 00:22:12.662 "d1f49418-9557-4273-a7fe-7a3b93edd6ba" 00:22:12.662 ], 00:22:12.662 "product_name": "Malloc disk", 00:22:12.662 "block_size": 512, 00:22:12.662 "num_blocks": 65536, 00:22:12.662 
"uuid": "d1f49418-9557-4273-a7fe-7a3b93edd6ba", 00:22:12.662 "assigned_rate_limits": { 00:22:12.662 "rw_ios_per_sec": 0, 00:22:12.662 "rw_mbytes_per_sec": 0, 00:22:12.662 "r_mbytes_per_sec": 0, 00:22:12.662 "w_mbytes_per_sec": 0 00:22:12.662 }, 00:22:12.662 "claimed": true, 00:22:12.662 "claim_type": "exclusive_write", 00:22:12.662 "zoned": false, 00:22:12.662 "supported_io_types": { 00:22:12.662 "read": true, 00:22:12.662 "write": true, 00:22:12.662 "unmap": true, 00:22:12.662 "flush": true, 00:22:12.662 "reset": true, 00:22:12.662 "nvme_admin": false, 00:22:12.662 "nvme_io": false, 00:22:12.662 "nvme_io_md": false, 00:22:12.662 "write_zeroes": true, 00:22:12.662 "zcopy": true, 00:22:12.662 "get_zone_info": false, 00:22:12.662 "zone_management": false, 00:22:12.662 "zone_append": false, 00:22:12.662 "compare": false, 00:22:12.662 "compare_and_write": false, 00:22:12.662 "abort": true, 00:22:12.662 "seek_hole": false, 00:22:12.662 "seek_data": false, 00:22:12.662 "copy": true, 00:22:12.662 "nvme_iov_md": false 00:22:12.662 }, 00:22:12.662 "memory_domains": [ 00:22:12.662 { 00:22:12.662 "dma_device_id": "system", 00:22:12.662 "dma_device_type": 1 00:22:12.662 }, 00:22:12.662 { 00:22:12.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.662 "dma_device_type": 2 00:22:12.662 } 00:22:12.662 ], 00:22:12.662 "driver_specific": {} 00:22:12.662 }' 00:22:12.662 22:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.920 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.920 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:12.920 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.920 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.920 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:12.920 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.920 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.920 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:12.920 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.180 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.180 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:13.180 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:13.438 [2024-07-12 22:29:23.527807] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:13.438 [2024-07-12 22:29:23.527840] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:13.438 [2024-07-12 22:29:23.527903] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:13.438 [2024-07-12 22:29:23.528199] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:13.438 [2024-07-12 22:29:23.528212] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe28610 name Existed_Raid, state offline 00:22:13.438 22:29:23 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@341 -- # killprocess 3511340 00:22:13.438 22:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 3511340 ']' 00:22:13.438 22:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 3511340 00:22:13.439 22:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:22:13.439 22:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:13.439 22:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3511340 00:22:13.439 22:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:13.439 22:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:13.439 22:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3511340' 00:22:13.439 killing process with pid 3511340 00:22:13.439 22:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 3511340 00:22:13.439 [2024-07-12 22:29:23.594957] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:13.439 22:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 3511340 00:22:13.439 [2024-07-12 22:29:23.633105] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:22:13.698 00:22:13.698 real 0m33.046s 00:22:13.698 user 1m0.801s 00:22:13.698 sys 0m5.870s 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:13.698 ************************************ 00:22:13.698 END TEST raid_state_function_test 00:22:13.698 ************************************ 00:22:13.698 22:29:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:13.698 22:29:23 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:22:13.698 22:29:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:13.698 22:29:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:13.698 22:29:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:13.698 ************************************ 00:22:13.698 START TEST raid_state_function_test_sb 00:22:13.698 ************************************ 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:13.698 
22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=3516223 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3516223' 00:22:13.698 Process raid pid: 3516223 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 3516223 /var/tmp/spdk-raid.sock 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3516223 ']' 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:22:13.698 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:13.698 22:29:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:13.698 [2024-07-12 22:29:23.993496] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:22:13.698 [2024-07-12 22:29:23.993560] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:13.958 [2024-07-12 22:29:24.122708] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:13.958 [2024-07-12 22:29:24.228687] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:14.218 [2024-07-12 22:29:24.292616] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:14.218 [2024-07-12 22:29:24.292647] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:14.787 22:29:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:14.787 22:29:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:22:14.787 22:29:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:15.081 [2024-07-12 22:29:25.147667] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:15.081 [2024-07-12 22:29:25.147708] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:15.081 [2024-07-12 22:29:25.147720] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:15.081 [2024-07-12 22:29:25.147732] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:15.081 [2024-07-12 22:29:25.147741] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:15.081 [2024-07-12 22:29:25.147752] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:15.081 [2024-07-12 22:29:25.147761] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:15.081 [2024-07-12 22:29:25.147772] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:15.081 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:15.081 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:15.081 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:15.081 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:15.081 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:15.081 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:15.081 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:22:15.081 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:15.081 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:15.081 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:15.081 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.081 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:15.340 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:15.340 "name": "Existed_Raid", 00:22:15.340 "uuid": "9e4915fe-822e-43cb-b311-232ad62c071b", 00:22:15.340 "strip_size_kb": 0, 00:22:15.340 "state": "configuring", 00:22:15.340 "raid_level": "raid1", 00:22:15.340 "superblock": true, 00:22:15.340 "num_base_bdevs": 4, 00:22:15.340 "num_base_bdevs_discovered": 0, 00:22:15.340 "num_base_bdevs_operational": 4, 00:22:15.340 "base_bdevs_list": [ 00:22:15.340 { 00:22:15.340 "name": "BaseBdev1", 00:22:15.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.340 "is_configured": false, 00:22:15.340 "data_offset": 0, 00:22:15.340 "data_size": 0 00:22:15.340 }, 00:22:15.340 { 00:22:15.340 "name": "BaseBdev2", 00:22:15.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.340 "is_configured": false, 00:22:15.340 "data_offset": 0, 00:22:15.340 "data_size": 0 00:22:15.340 }, 00:22:15.340 { 00:22:15.340 "name": "BaseBdev3", 00:22:15.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.340 "is_configured": false, 00:22:15.340 "data_offset": 0, 00:22:15.340 "data_size": 0 00:22:15.340 }, 00:22:15.340 { 00:22:15.340 "name": "BaseBdev4", 00:22:15.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.340 "is_configured": false, 00:22:15.340 "data_offset": 0, 00:22:15.340 "data_size": 0 00:22:15.340 } 00:22:15.340 ] 00:22:15.340 }' 00:22:15.340 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:15.340 22:29:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:15.600 22:29:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:15.860 [2024-07-12 22:29:26.130146] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:15.860 [2024-07-12 22:29:26.130179] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe3daa0 name Existed_Raid, state configuring 00:22:15.860 22:29:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:16.120 [2024-07-12 22:29:26.378822] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:16.120 [2024-07-12 22:29:26.378851] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:16.120 [2024-07-12 22:29:26.378861] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:16.120 [2024-07-12 22:29:26.378873] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:22:16.120 [2024-07-12 22:29:26.378881] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:16.120 [2024-07-12 22:29:26.378893] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:16.120 [2024-07-12 22:29:26.378901] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:16.120 [2024-07-12 22:29:26.378912] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:16.120 22:29:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:16.380 [2024-07-12 22:29:26.633368] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:16.380 BaseBdev1 00:22:16.380 22:29:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:16.380 22:29:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:16.380 22:29:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:16.380 22:29:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:16.380 22:29:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:16.380 22:29:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:16.380 22:29:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:16.639 22:29:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:16.901 [ 00:22:16.901 { 00:22:16.901 "name": "BaseBdev1", 00:22:16.901 "aliases": [ 00:22:16.901 "d51b63d9-0200-4829-b2b8-d4fb340b6e5e" 00:22:16.901 ], 00:22:16.901 "product_name": "Malloc disk", 00:22:16.901 "block_size": 512, 00:22:16.901 "num_blocks": 65536, 00:22:16.901 "uuid": "d51b63d9-0200-4829-b2b8-d4fb340b6e5e", 00:22:16.901 "assigned_rate_limits": { 00:22:16.901 "rw_ios_per_sec": 0, 00:22:16.901 "rw_mbytes_per_sec": 0, 00:22:16.901 "r_mbytes_per_sec": 0, 00:22:16.901 "w_mbytes_per_sec": 0 00:22:16.901 }, 00:22:16.901 "claimed": true, 00:22:16.901 "claim_type": "exclusive_write", 00:22:16.901 "zoned": false, 00:22:16.901 "supported_io_types": { 00:22:16.901 "read": true, 00:22:16.901 "write": true, 00:22:16.901 "unmap": true, 00:22:16.901 "flush": true, 00:22:16.901 "reset": true, 00:22:16.901 "nvme_admin": false, 00:22:16.901 "nvme_io": false, 00:22:16.901 "nvme_io_md": false, 00:22:16.901 "write_zeroes": true, 00:22:16.901 "zcopy": true, 00:22:16.901 "get_zone_info": false, 00:22:16.901 "zone_management": false, 00:22:16.901 "zone_append": false, 00:22:16.901 "compare": false, 00:22:16.901 "compare_and_write": false, 00:22:16.901 "abort": true, 00:22:16.901 "seek_hole": false, 00:22:16.901 "seek_data": false, 00:22:16.901 "copy": true, 00:22:16.901 "nvme_iov_md": false 00:22:16.901 }, 00:22:16.901 "memory_domains": [ 00:22:16.901 { 00:22:16.901 "dma_device_id": "system", 00:22:16.901 "dma_device_type": 1 00:22:16.901 }, 00:22:16.901 { 00:22:16.901 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:22:16.901 "dma_device_type": 2 00:22:16.901 } 00:22:16.901 ], 00:22:16.901 "driver_specific": {} 00:22:16.901 } 00:22:16.901 ] 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:16.901 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.161 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:17.161 "name": "Existed_Raid", 00:22:17.161 "uuid": "61d3962f-4059-438d-a12e-39bc87e54f6e", 00:22:17.161 "strip_size_kb": 0, 00:22:17.161 "state": "configuring", 00:22:17.161 "raid_level": "raid1", 00:22:17.161 "superblock": true, 00:22:17.161 "num_base_bdevs": 4, 00:22:17.161 "num_base_bdevs_discovered": 1, 00:22:17.161 "num_base_bdevs_operational": 4, 00:22:17.161 "base_bdevs_list": [ 00:22:17.161 { 00:22:17.161 "name": "BaseBdev1", 00:22:17.161 "uuid": "d51b63d9-0200-4829-b2b8-d4fb340b6e5e", 00:22:17.161 "is_configured": true, 00:22:17.161 "data_offset": 2048, 00:22:17.161 "data_size": 63488 00:22:17.161 }, 00:22:17.161 { 00:22:17.161 "name": "BaseBdev2", 00:22:17.161 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.161 "is_configured": false, 00:22:17.161 "data_offset": 0, 00:22:17.161 "data_size": 0 00:22:17.161 }, 00:22:17.161 { 00:22:17.161 "name": "BaseBdev3", 00:22:17.161 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.161 "is_configured": false, 00:22:17.161 "data_offset": 0, 00:22:17.161 "data_size": 0 00:22:17.161 }, 00:22:17.161 { 00:22:17.161 "name": "BaseBdev4", 00:22:17.161 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.161 "is_configured": false, 00:22:17.161 "data_offset": 0, 00:22:17.161 "data_size": 0 00:22:17.161 } 00:22:17.161 ] 00:22:17.161 }' 00:22:17.161 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:17.161 22:29:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:17.730 22:29:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:17.989 [2024-07-12 22:29:28.209558] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:17.989 [2024-07-12 22:29:28.209598] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe3d310 name Existed_Raid, state configuring 00:22:17.989 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:18.248 [2024-07-12 22:29:28.450241] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:18.248 [2024-07-12 22:29:28.451671] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:18.248 [2024-07-12 22:29:28.451703] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:18.248 [2024-07-12 22:29:28.451714] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:18.248 [2024-07-12 22:29:28.451726] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:18.248 [2024-07-12 22:29:28.451735] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:18.248 [2024-07-12 22:29:28.451746] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:18.248 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.249 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:18.508 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:18.508 "name": "Existed_Raid", 00:22:18.508 "uuid": "ae8be018-ee75-4396-a470-f2c4c7be99d0", 00:22:18.508 "strip_size_kb": 0, 
00:22:18.508 "state": "configuring", 00:22:18.508 "raid_level": "raid1", 00:22:18.508 "superblock": true, 00:22:18.508 "num_base_bdevs": 4, 00:22:18.508 "num_base_bdevs_discovered": 1, 00:22:18.508 "num_base_bdevs_operational": 4, 00:22:18.508 "base_bdevs_list": [ 00:22:18.508 { 00:22:18.508 "name": "BaseBdev1", 00:22:18.508 "uuid": "d51b63d9-0200-4829-b2b8-d4fb340b6e5e", 00:22:18.508 "is_configured": true, 00:22:18.508 "data_offset": 2048, 00:22:18.508 "data_size": 63488 00:22:18.508 }, 00:22:18.508 { 00:22:18.508 "name": "BaseBdev2", 00:22:18.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.508 "is_configured": false, 00:22:18.508 "data_offset": 0, 00:22:18.508 "data_size": 0 00:22:18.508 }, 00:22:18.508 { 00:22:18.508 "name": "BaseBdev3", 00:22:18.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.508 "is_configured": false, 00:22:18.508 "data_offset": 0, 00:22:18.508 "data_size": 0 00:22:18.508 }, 00:22:18.508 { 00:22:18.508 "name": "BaseBdev4", 00:22:18.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.508 "is_configured": false, 00:22:18.508 "data_offset": 0, 00:22:18.508 "data_size": 0 00:22:18.508 } 00:22:18.508 ] 00:22:18.508 }' 00:22:18.508 22:29:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:18.508 22:29:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:19.076 22:29:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:19.334 [2024-07-12 22:29:29.520418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:19.334 BaseBdev2 00:22:19.334 22:29:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:19.334 22:29:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:19.334 22:29:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:19.335 22:29:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:19.335 22:29:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:19.335 22:29:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:19.335 22:29:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:19.594 22:29:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:19.853 [ 00:22:19.853 { 00:22:19.853 "name": "BaseBdev2", 00:22:19.853 "aliases": [ 00:22:19.853 "edb935f9-8958-4870-af59-8353e524f65f" 00:22:19.853 ], 00:22:19.853 "product_name": "Malloc disk", 00:22:19.853 "block_size": 512, 00:22:19.853 "num_blocks": 65536, 00:22:19.853 "uuid": "edb935f9-8958-4870-af59-8353e524f65f", 00:22:19.853 "assigned_rate_limits": { 00:22:19.853 "rw_ios_per_sec": 0, 00:22:19.853 "rw_mbytes_per_sec": 0, 00:22:19.853 "r_mbytes_per_sec": 0, 00:22:19.853 "w_mbytes_per_sec": 0 00:22:19.853 }, 00:22:19.853 "claimed": true, 00:22:19.853 "claim_type": "exclusive_write", 00:22:19.853 "zoned": false, 00:22:19.853 "supported_io_types": { 
00:22:19.853 "read": true, 00:22:19.853 "write": true, 00:22:19.853 "unmap": true, 00:22:19.853 "flush": true, 00:22:19.853 "reset": true, 00:22:19.853 "nvme_admin": false, 00:22:19.853 "nvme_io": false, 00:22:19.853 "nvme_io_md": false, 00:22:19.853 "write_zeroes": true, 00:22:19.853 "zcopy": true, 00:22:19.853 "get_zone_info": false, 00:22:19.853 "zone_management": false, 00:22:19.853 "zone_append": false, 00:22:19.853 "compare": false, 00:22:19.853 "compare_and_write": false, 00:22:19.853 "abort": true, 00:22:19.853 "seek_hole": false, 00:22:19.853 "seek_data": false, 00:22:19.853 "copy": true, 00:22:19.853 "nvme_iov_md": false 00:22:19.853 }, 00:22:19.853 "memory_domains": [ 00:22:19.853 { 00:22:19.853 "dma_device_id": "system", 00:22:19.853 "dma_device_type": 1 00:22:19.853 }, 00:22:19.853 { 00:22:19.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:19.853 "dma_device_type": 2 00:22:19.853 } 00:22:19.853 ], 00:22:19.853 "driver_specific": {} 00:22:19.853 } 00:22:19.853 ] 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:19.853 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.112 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:20.112 "name": "Existed_Raid", 00:22:20.112 "uuid": "ae8be018-ee75-4396-a470-f2c4c7be99d0", 00:22:20.112 "strip_size_kb": 0, 00:22:20.112 "state": "configuring", 00:22:20.112 "raid_level": "raid1", 00:22:20.112 "superblock": true, 00:22:20.112 "num_base_bdevs": 4, 00:22:20.112 "num_base_bdevs_discovered": 2, 00:22:20.112 "num_base_bdevs_operational": 4, 00:22:20.112 "base_bdevs_list": [ 00:22:20.112 { 00:22:20.112 "name": "BaseBdev1", 00:22:20.112 "uuid": "d51b63d9-0200-4829-b2b8-d4fb340b6e5e", 00:22:20.112 "is_configured": true, 00:22:20.112 "data_offset": 2048, 00:22:20.112 "data_size": 
63488 00:22:20.112 }, 00:22:20.112 { 00:22:20.112 "name": "BaseBdev2", 00:22:20.112 "uuid": "edb935f9-8958-4870-af59-8353e524f65f", 00:22:20.112 "is_configured": true, 00:22:20.112 "data_offset": 2048, 00:22:20.112 "data_size": 63488 00:22:20.112 }, 00:22:20.112 { 00:22:20.112 "name": "BaseBdev3", 00:22:20.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.112 "is_configured": false, 00:22:20.112 "data_offset": 0, 00:22:20.112 "data_size": 0 00:22:20.112 }, 00:22:20.112 { 00:22:20.112 "name": "BaseBdev4", 00:22:20.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.112 "is_configured": false, 00:22:20.112 "data_offset": 0, 00:22:20.112 "data_size": 0 00:22:20.112 } 00:22:20.112 ] 00:22:20.112 }' 00:22:20.112 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:20.112 22:29:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:20.680 22:29:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:20.939 [2024-07-12 22:29:31.124159] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:20.939 BaseBdev3 00:22:20.939 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:22:20.939 22:29:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:20.939 22:29:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:20.939 22:29:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:20.939 22:29:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:20.939 22:29:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:20.939 22:29:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:21.199 22:29:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:21.459 [ 00:22:21.459 { 00:22:21.459 "name": "BaseBdev3", 00:22:21.459 "aliases": [ 00:22:21.459 "817ad8ca-8589-472d-b293-2a755d01e1b5" 00:22:21.459 ], 00:22:21.459 "product_name": "Malloc disk", 00:22:21.459 "block_size": 512, 00:22:21.459 "num_blocks": 65536, 00:22:21.459 "uuid": "817ad8ca-8589-472d-b293-2a755d01e1b5", 00:22:21.459 "assigned_rate_limits": { 00:22:21.459 "rw_ios_per_sec": 0, 00:22:21.459 "rw_mbytes_per_sec": 0, 00:22:21.459 "r_mbytes_per_sec": 0, 00:22:21.459 "w_mbytes_per_sec": 0 00:22:21.459 }, 00:22:21.459 "claimed": true, 00:22:21.459 "claim_type": "exclusive_write", 00:22:21.459 "zoned": false, 00:22:21.459 "supported_io_types": { 00:22:21.459 "read": true, 00:22:21.459 "write": true, 00:22:21.459 "unmap": true, 00:22:21.459 "flush": true, 00:22:21.459 "reset": true, 00:22:21.459 "nvme_admin": false, 00:22:21.459 "nvme_io": false, 00:22:21.459 "nvme_io_md": false, 00:22:21.459 "write_zeroes": true, 00:22:21.459 "zcopy": true, 00:22:21.459 "get_zone_info": false, 00:22:21.459 "zone_management": false, 00:22:21.459 "zone_append": false, 00:22:21.459 "compare": false, 00:22:21.459 
"compare_and_write": false, 00:22:21.459 "abort": true, 00:22:21.459 "seek_hole": false, 00:22:21.459 "seek_data": false, 00:22:21.459 "copy": true, 00:22:21.459 "nvme_iov_md": false 00:22:21.459 }, 00:22:21.459 "memory_domains": [ 00:22:21.459 { 00:22:21.459 "dma_device_id": "system", 00:22:21.459 "dma_device_type": 1 00:22:21.459 }, 00:22:21.459 { 00:22:21.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:21.459 "dma_device_type": 2 00:22:21.459 } 00:22:21.459 ], 00:22:21.459 "driver_specific": {} 00:22:21.459 } 00:22:21.459 ] 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.459 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:21.719 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.719 "name": "Existed_Raid", 00:22:21.719 "uuid": "ae8be018-ee75-4396-a470-f2c4c7be99d0", 00:22:21.719 "strip_size_kb": 0, 00:22:21.719 "state": "configuring", 00:22:21.719 "raid_level": "raid1", 00:22:21.719 "superblock": true, 00:22:21.719 "num_base_bdevs": 4, 00:22:21.719 "num_base_bdevs_discovered": 3, 00:22:21.719 "num_base_bdevs_operational": 4, 00:22:21.719 "base_bdevs_list": [ 00:22:21.719 { 00:22:21.719 "name": "BaseBdev1", 00:22:21.719 "uuid": "d51b63d9-0200-4829-b2b8-d4fb340b6e5e", 00:22:21.719 "is_configured": true, 00:22:21.719 "data_offset": 2048, 00:22:21.719 "data_size": 63488 00:22:21.719 }, 00:22:21.719 { 00:22:21.719 "name": "BaseBdev2", 00:22:21.719 "uuid": "edb935f9-8958-4870-af59-8353e524f65f", 00:22:21.719 "is_configured": true, 00:22:21.719 "data_offset": 2048, 00:22:21.719 "data_size": 63488 00:22:21.719 }, 00:22:21.719 { 00:22:21.719 "name": "BaseBdev3", 00:22:21.719 "uuid": "817ad8ca-8589-472d-b293-2a755d01e1b5", 00:22:21.719 "is_configured": true, 00:22:21.719 "data_offset": 2048, 00:22:21.719 "data_size": 
63488 00:22:21.719 }, 00:22:21.719 { 00:22:21.719 "name": "BaseBdev4", 00:22:21.719 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.719 "is_configured": false, 00:22:21.719 "data_offset": 0, 00:22:21.719 "data_size": 0 00:22:21.719 } 00:22:21.719 ] 00:22:21.719 }' 00:22:21.719 22:29:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.719 22:29:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:22.287 22:29:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:22.287 [2024-07-12 22:29:32.571404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:22.287 [2024-07-12 22:29:32.571575] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe3e350 00:22:22.287 [2024-07-12 22:29:32.571588] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:22.287 [2024-07-12 22:29:32.571770] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe3e020 00:22:22.287 [2024-07-12 22:29:32.571895] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe3e350 00:22:22.287 [2024-07-12 22:29:32.571905] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe3e350 00:22:22.287 [2024-07-12 22:29:32.572010] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:22.287 BaseBdev4 00:22:22.287 22:29:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:22.287 22:29:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:22.287 22:29:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:22.287 22:29:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:22.287 22:29:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:22.287 22:29:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:22.287 22:29:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:22.548 22:29:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:22.807 [ 00:22:22.807 { 00:22:22.807 "name": "BaseBdev4", 00:22:22.807 "aliases": [ 00:22:22.807 "dd84068d-387b-4d21-91dc-ce5348374b69" 00:22:22.807 ], 00:22:22.807 "product_name": "Malloc disk", 00:22:22.807 "block_size": 512, 00:22:22.807 "num_blocks": 65536, 00:22:22.807 "uuid": "dd84068d-387b-4d21-91dc-ce5348374b69", 00:22:22.807 "assigned_rate_limits": { 00:22:22.807 "rw_ios_per_sec": 0, 00:22:22.807 "rw_mbytes_per_sec": 0, 00:22:22.807 "r_mbytes_per_sec": 0, 00:22:22.807 "w_mbytes_per_sec": 0 00:22:22.807 }, 00:22:22.807 "claimed": true, 00:22:22.807 "claim_type": "exclusive_write", 00:22:22.807 "zoned": false, 00:22:22.807 "supported_io_types": { 00:22:22.807 "read": true, 00:22:22.807 "write": true, 00:22:22.807 "unmap": true, 00:22:22.807 "flush": true, 00:22:22.807 "reset": true, 00:22:22.807 
"nvme_admin": false, 00:22:22.807 "nvme_io": false, 00:22:22.807 "nvme_io_md": false, 00:22:22.807 "write_zeroes": true, 00:22:22.807 "zcopy": true, 00:22:22.807 "get_zone_info": false, 00:22:22.807 "zone_management": false, 00:22:22.807 "zone_append": false, 00:22:22.807 "compare": false, 00:22:22.807 "compare_and_write": false, 00:22:22.807 "abort": true, 00:22:22.807 "seek_hole": false, 00:22:22.807 "seek_data": false, 00:22:22.807 "copy": true, 00:22:22.807 "nvme_iov_md": false 00:22:22.807 }, 00:22:22.807 "memory_domains": [ 00:22:22.807 { 00:22:22.807 "dma_device_id": "system", 00:22:22.807 "dma_device_type": 1 00:22:22.807 }, 00:22:22.807 { 00:22:22.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:22.807 "dma_device_type": 2 00:22:22.807 } 00:22:22.807 ], 00:22:22.807 "driver_specific": {} 00:22:22.807 } 00:22:22.807 ] 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:22.807 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.067 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.067 "name": "Existed_Raid", 00:22:23.067 "uuid": "ae8be018-ee75-4396-a470-f2c4c7be99d0", 00:22:23.067 "strip_size_kb": 0, 00:22:23.067 "state": "online", 00:22:23.067 "raid_level": "raid1", 00:22:23.067 "superblock": true, 00:22:23.067 "num_base_bdevs": 4, 00:22:23.067 "num_base_bdevs_discovered": 4, 00:22:23.067 "num_base_bdevs_operational": 4, 00:22:23.067 "base_bdevs_list": [ 00:22:23.067 { 00:22:23.067 "name": "BaseBdev1", 00:22:23.067 "uuid": "d51b63d9-0200-4829-b2b8-d4fb340b6e5e", 00:22:23.067 "is_configured": true, 00:22:23.067 "data_offset": 2048, 00:22:23.067 "data_size": 63488 00:22:23.067 }, 00:22:23.067 { 00:22:23.067 "name": "BaseBdev2", 00:22:23.067 "uuid": "edb935f9-8958-4870-af59-8353e524f65f", 00:22:23.067 "is_configured": true, 
00:22:23.067 "data_offset": 2048, 00:22:23.067 "data_size": 63488 00:22:23.067 }, 00:22:23.067 { 00:22:23.067 "name": "BaseBdev3", 00:22:23.067 "uuid": "817ad8ca-8589-472d-b293-2a755d01e1b5", 00:22:23.067 "is_configured": true, 00:22:23.067 "data_offset": 2048, 00:22:23.067 "data_size": 63488 00:22:23.067 }, 00:22:23.067 { 00:22:23.067 "name": "BaseBdev4", 00:22:23.067 "uuid": "dd84068d-387b-4d21-91dc-ce5348374b69", 00:22:23.067 "is_configured": true, 00:22:23.067 "data_offset": 2048, 00:22:23.067 "data_size": 63488 00:22:23.067 } 00:22:23.067 ] 00:22:23.067 }' 00:22:23.067 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.067 22:29:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:23.635 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:23.635 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:23.635 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:23.635 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:23.635 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:23.635 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:23.635 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:23.635 22:29:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:23.894 [2024-07-12 22:29:34.159961] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:23.894 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:23.894 "name": "Existed_Raid", 00:22:23.894 "aliases": [ 00:22:23.894 "ae8be018-ee75-4396-a470-f2c4c7be99d0" 00:22:23.894 ], 00:22:23.894 "product_name": "Raid Volume", 00:22:23.894 "block_size": 512, 00:22:23.894 "num_blocks": 63488, 00:22:23.894 "uuid": "ae8be018-ee75-4396-a470-f2c4c7be99d0", 00:22:23.894 "assigned_rate_limits": { 00:22:23.894 "rw_ios_per_sec": 0, 00:22:23.894 "rw_mbytes_per_sec": 0, 00:22:23.894 "r_mbytes_per_sec": 0, 00:22:23.894 "w_mbytes_per_sec": 0 00:22:23.894 }, 00:22:23.894 "claimed": false, 00:22:23.894 "zoned": false, 00:22:23.894 "supported_io_types": { 00:22:23.894 "read": true, 00:22:23.894 "write": true, 00:22:23.894 "unmap": false, 00:22:23.894 "flush": false, 00:22:23.894 "reset": true, 00:22:23.894 "nvme_admin": false, 00:22:23.894 "nvme_io": false, 00:22:23.894 "nvme_io_md": false, 00:22:23.894 "write_zeroes": true, 00:22:23.894 "zcopy": false, 00:22:23.894 "get_zone_info": false, 00:22:23.894 "zone_management": false, 00:22:23.894 "zone_append": false, 00:22:23.894 "compare": false, 00:22:23.894 "compare_and_write": false, 00:22:23.894 "abort": false, 00:22:23.894 "seek_hole": false, 00:22:23.894 "seek_data": false, 00:22:23.894 "copy": false, 00:22:23.894 "nvme_iov_md": false 00:22:23.894 }, 00:22:23.894 "memory_domains": [ 00:22:23.894 { 00:22:23.894 "dma_device_id": "system", 00:22:23.894 "dma_device_type": 1 00:22:23.894 }, 00:22:23.894 { 00:22:23.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.894 "dma_device_type": 2 00:22:23.894 }, 00:22:23.894 { 00:22:23.894 
"dma_device_id": "system", 00:22:23.894 "dma_device_type": 1 00:22:23.894 }, 00:22:23.894 { 00:22:23.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.894 "dma_device_type": 2 00:22:23.894 }, 00:22:23.894 { 00:22:23.894 "dma_device_id": "system", 00:22:23.894 "dma_device_type": 1 00:22:23.894 }, 00:22:23.894 { 00:22:23.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.894 "dma_device_type": 2 00:22:23.894 }, 00:22:23.894 { 00:22:23.894 "dma_device_id": "system", 00:22:23.894 "dma_device_type": 1 00:22:23.894 }, 00:22:23.894 { 00:22:23.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.894 "dma_device_type": 2 00:22:23.894 } 00:22:23.894 ], 00:22:23.894 "driver_specific": { 00:22:23.894 "raid": { 00:22:23.894 "uuid": "ae8be018-ee75-4396-a470-f2c4c7be99d0", 00:22:23.894 "strip_size_kb": 0, 00:22:23.894 "state": "online", 00:22:23.894 "raid_level": "raid1", 00:22:23.894 "superblock": true, 00:22:23.894 "num_base_bdevs": 4, 00:22:23.894 "num_base_bdevs_discovered": 4, 00:22:23.894 "num_base_bdevs_operational": 4, 00:22:23.894 "base_bdevs_list": [ 00:22:23.894 { 00:22:23.894 "name": "BaseBdev1", 00:22:23.894 "uuid": "d51b63d9-0200-4829-b2b8-d4fb340b6e5e", 00:22:23.894 "is_configured": true, 00:22:23.894 "data_offset": 2048, 00:22:23.894 "data_size": 63488 00:22:23.894 }, 00:22:23.894 { 00:22:23.894 "name": "BaseBdev2", 00:22:23.894 "uuid": "edb935f9-8958-4870-af59-8353e524f65f", 00:22:23.894 "is_configured": true, 00:22:23.894 "data_offset": 2048, 00:22:23.894 "data_size": 63488 00:22:23.894 }, 00:22:23.894 { 00:22:23.894 "name": "BaseBdev3", 00:22:23.894 "uuid": "817ad8ca-8589-472d-b293-2a755d01e1b5", 00:22:23.894 "is_configured": true, 00:22:23.894 "data_offset": 2048, 00:22:23.894 "data_size": 63488 00:22:23.894 }, 00:22:23.894 { 00:22:23.894 "name": "BaseBdev4", 00:22:23.894 "uuid": "dd84068d-387b-4d21-91dc-ce5348374b69", 00:22:23.894 "is_configured": true, 00:22:23.894 "data_offset": 2048, 00:22:23.894 "data_size": 63488 00:22:23.894 } 00:22:23.894 ] 00:22:23.894 } 00:22:23.894 } 00:22:23.894 }' 00:22:23.894 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:24.154 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:24.154 BaseBdev2 00:22:24.154 BaseBdev3 00:22:24.154 BaseBdev4' 00:22:24.154 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:24.154 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:24.154 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:24.414 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:24.414 "name": "BaseBdev1", 00:22:24.414 "aliases": [ 00:22:24.414 "d51b63d9-0200-4829-b2b8-d4fb340b6e5e" 00:22:24.414 ], 00:22:24.414 "product_name": "Malloc disk", 00:22:24.414 "block_size": 512, 00:22:24.414 "num_blocks": 65536, 00:22:24.414 "uuid": "d51b63d9-0200-4829-b2b8-d4fb340b6e5e", 00:22:24.414 "assigned_rate_limits": { 00:22:24.414 "rw_ios_per_sec": 0, 00:22:24.414 "rw_mbytes_per_sec": 0, 00:22:24.414 "r_mbytes_per_sec": 0, 00:22:24.414 "w_mbytes_per_sec": 0 00:22:24.414 }, 00:22:24.414 "claimed": true, 00:22:24.414 "claim_type": "exclusive_write", 00:22:24.414 "zoned": false, 
00:22:24.414 "supported_io_types": { 00:22:24.414 "read": true, 00:22:24.414 "write": true, 00:22:24.414 "unmap": true, 00:22:24.414 "flush": true, 00:22:24.414 "reset": true, 00:22:24.414 "nvme_admin": false, 00:22:24.414 "nvme_io": false, 00:22:24.414 "nvme_io_md": false, 00:22:24.414 "write_zeroes": true, 00:22:24.414 "zcopy": true, 00:22:24.414 "get_zone_info": false, 00:22:24.414 "zone_management": false, 00:22:24.414 "zone_append": false, 00:22:24.414 "compare": false, 00:22:24.414 "compare_and_write": false, 00:22:24.414 "abort": true, 00:22:24.414 "seek_hole": false, 00:22:24.414 "seek_data": false, 00:22:24.414 "copy": true, 00:22:24.414 "nvme_iov_md": false 00:22:24.414 }, 00:22:24.414 "memory_domains": [ 00:22:24.414 { 00:22:24.414 "dma_device_id": "system", 00:22:24.414 "dma_device_type": 1 00:22:24.414 }, 00:22:24.414 { 00:22:24.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.414 "dma_device_type": 2 00:22:24.414 } 00:22:24.414 ], 00:22:24.414 "driver_specific": {} 00:22:24.414 }' 00:22:24.414 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.414 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.414 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:24.414 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.414 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.414 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:24.414 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.414 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.673 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:24.673 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.673 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.673 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:24.673 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:24.673 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:24.673 22:29:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:24.932 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:24.932 "name": "BaseBdev2", 00:22:24.932 "aliases": [ 00:22:24.932 "edb935f9-8958-4870-af59-8353e524f65f" 00:22:24.932 ], 00:22:24.932 "product_name": "Malloc disk", 00:22:24.932 "block_size": 512, 00:22:24.932 "num_blocks": 65536, 00:22:24.932 "uuid": "edb935f9-8958-4870-af59-8353e524f65f", 00:22:24.932 "assigned_rate_limits": { 00:22:24.932 "rw_ios_per_sec": 0, 00:22:24.932 "rw_mbytes_per_sec": 0, 00:22:24.932 "r_mbytes_per_sec": 0, 00:22:24.932 "w_mbytes_per_sec": 0 00:22:24.932 }, 00:22:24.932 "claimed": true, 00:22:24.932 "claim_type": "exclusive_write", 00:22:24.933 "zoned": false, 00:22:24.933 "supported_io_types": { 00:22:24.933 "read": true, 00:22:24.933 "write": true, 00:22:24.933 "unmap": true, 00:22:24.933 
"flush": true, 00:22:24.933 "reset": true, 00:22:24.933 "nvme_admin": false, 00:22:24.933 "nvme_io": false, 00:22:24.933 "nvme_io_md": false, 00:22:24.933 "write_zeroes": true, 00:22:24.933 "zcopy": true, 00:22:24.933 "get_zone_info": false, 00:22:24.933 "zone_management": false, 00:22:24.933 "zone_append": false, 00:22:24.933 "compare": false, 00:22:24.933 "compare_and_write": false, 00:22:24.933 "abort": true, 00:22:24.933 "seek_hole": false, 00:22:24.933 "seek_data": false, 00:22:24.933 "copy": true, 00:22:24.933 "nvme_iov_md": false 00:22:24.933 }, 00:22:24.933 "memory_domains": [ 00:22:24.933 { 00:22:24.933 "dma_device_id": "system", 00:22:24.933 "dma_device_type": 1 00:22:24.933 }, 00:22:24.933 { 00:22:24.933 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.933 "dma_device_type": 2 00:22:24.933 } 00:22:24.933 ], 00:22:24.933 "driver_specific": {} 00:22:24.933 }' 00:22:24.933 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.933 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.933 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:24.933 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.933 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.192 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:25.192 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.192 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.192 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:25.192 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.192 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.192 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:25.192 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:25.192 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:25.192 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:25.452 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:25.452 "name": "BaseBdev3", 00:22:25.452 "aliases": [ 00:22:25.452 "817ad8ca-8589-472d-b293-2a755d01e1b5" 00:22:25.452 ], 00:22:25.452 "product_name": "Malloc disk", 00:22:25.452 "block_size": 512, 00:22:25.452 "num_blocks": 65536, 00:22:25.452 "uuid": "817ad8ca-8589-472d-b293-2a755d01e1b5", 00:22:25.452 "assigned_rate_limits": { 00:22:25.452 "rw_ios_per_sec": 0, 00:22:25.452 "rw_mbytes_per_sec": 0, 00:22:25.452 "r_mbytes_per_sec": 0, 00:22:25.452 "w_mbytes_per_sec": 0 00:22:25.452 }, 00:22:25.452 "claimed": true, 00:22:25.452 "claim_type": "exclusive_write", 00:22:25.452 "zoned": false, 00:22:25.452 "supported_io_types": { 00:22:25.452 "read": true, 00:22:25.452 "write": true, 00:22:25.452 "unmap": true, 00:22:25.452 "flush": true, 00:22:25.452 "reset": true, 00:22:25.452 "nvme_admin": false, 00:22:25.452 "nvme_io": false, 00:22:25.452 
"nvme_io_md": false, 00:22:25.452 "write_zeroes": true, 00:22:25.452 "zcopy": true, 00:22:25.452 "get_zone_info": false, 00:22:25.452 "zone_management": false, 00:22:25.452 "zone_append": false, 00:22:25.452 "compare": false, 00:22:25.452 "compare_and_write": false, 00:22:25.452 "abort": true, 00:22:25.452 "seek_hole": false, 00:22:25.452 "seek_data": false, 00:22:25.452 "copy": true, 00:22:25.452 "nvme_iov_md": false 00:22:25.452 }, 00:22:25.452 "memory_domains": [ 00:22:25.452 { 00:22:25.452 "dma_device_id": "system", 00:22:25.452 "dma_device_type": 1 00:22:25.452 }, 00:22:25.452 { 00:22:25.452 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.452 "dma_device_type": 2 00:22:25.452 } 00:22:25.452 ], 00:22:25.452 "driver_specific": {} 00:22:25.452 }' 00:22:25.452 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.452 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.712 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:25.712 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.712 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.712 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:25.712 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.712 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.712 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:25.712 22:29:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.712 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.971 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:25.971 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:25.971 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:25.971 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:26.230 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:26.230 "name": "BaseBdev4", 00:22:26.230 "aliases": [ 00:22:26.230 "dd84068d-387b-4d21-91dc-ce5348374b69" 00:22:26.230 ], 00:22:26.230 "product_name": "Malloc disk", 00:22:26.230 "block_size": 512, 00:22:26.230 "num_blocks": 65536, 00:22:26.230 "uuid": "dd84068d-387b-4d21-91dc-ce5348374b69", 00:22:26.230 "assigned_rate_limits": { 00:22:26.230 "rw_ios_per_sec": 0, 00:22:26.230 "rw_mbytes_per_sec": 0, 00:22:26.230 "r_mbytes_per_sec": 0, 00:22:26.230 "w_mbytes_per_sec": 0 00:22:26.230 }, 00:22:26.230 "claimed": true, 00:22:26.230 "claim_type": "exclusive_write", 00:22:26.230 "zoned": false, 00:22:26.230 "supported_io_types": { 00:22:26.230 "read": true, 00:22:26.230 "write": true, 00:22:26.230 "unmap": true, 00:22:26.230 "flush": true, 00:22:26.230 "reset": true, 00:22:26.230 "nvme_admin": false, 00:22:26.230 "nvme_io": false, 00:22:26.230 "nvme_io_md": false, 00:22:26.230 "write_zeroes": true, 00:22:26.230 "zcopy": true, 00:22:26.230 "get_zone_info": false, 00:22:26.230 
"zone_management": false, 00:22:26.230 "zone_append": false, 00:22:26.230 "compare": false, 00:22:26.230 "compare_and_write": false, 00:22:26.231 "abort": true, 00:22:26.231 "seek_hole": false, 00:22:26.231 "seek_data": false, 00:22:26.231 "copy": true, 00:22:26.231 "nvme_iov_md": false 00:22:26.231 }, 00:22:26.231 "memory_domains": [ 00:22:26.231 { 00:22:26.231 "dma_device_id": "system", 00:22:26.231 "dma_device_type": 1 00:22:26.231 }, 00:22:26.231 { 00:22:26.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.231 "dma_device_type": 2 00:22:26.231 } 00:22:26.231 ], 00:22:26.231 "driver_specific": {} 00:22:26.231 }' 00:22:26.231 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:26.231 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:26.231 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:26.231 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:26.231 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:26.231 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:26.231 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.231 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.490 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:26.490 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.490 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.490 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:26.490 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:26.750 [2024-07-12 22:29:36.902982] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:26.750 22:29:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:27.009 22:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:27.009 "name": "Existed_Raid", 00:22:27.009 "uuid": "ae8be018-ee75-4396-a470-f2c4c7be99d0", 00:22:27.009 "strip_size_kb": 0, 00:22:27.009 "state": "online", 00:22:27.009 "raid_level": "raid1", 00:22:27.009 "superblock": true, 00:22:27.009 "num_base_bdevs": 4, 00:22:27.009 "num_base_bdevs_discovered": 3, 00:22:27.009 "num_base_bdevs_operational": 3, 00:22:27.009 "base_bdevs_list": [ 00:22:27.009 { 00:22:27.009 "name": null, 00:22:27.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.009 "is_configured": false, 00:22:27.009 "data_offset": 2048, 00:22:27.009 "data_size": 63488 00:22:27.009 }, 00:22:27.009 { 00:22:27.009 "name": "BaseBdev2", 00:22:27.009 "uuid": "edb935f9-8958-4870-af59-8353e524f65f", 00:22:27.009 "is_configured": true, 00:22:27.009 "data_offset": 2048, 00:22:27.009 "data_size": 63488 00:22:27.009 }, 00:22:27.009 { 00:22:27.009 "name": "BaseBdev3", 00:22:27.009 "uuid": "817ad8ca-8589-472d-b293-2a755d01e1b5", 00:22:27.009 "is_configured": true, 00:22:27.009 "data_offset": 2048, 00:22:27.009 "data_size": 63488 00:22:27.009 }, 00:22:27.009 { 00:22:27.009 "name": "BaseBdev4", 00:22:27.009 "uuid": "dd84068d-387b-4d21-91dc-ce5348374b69", 00:22:27.009 "is_configured": true, 00:22:27.009 "data_offset": 2048, 00:22:27.009 "data_size": 63488 00:22:27.009 } 00:22:27.009 ] 00:22:27.009 }' 00:22:27.009 22:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:27.009 22:29:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:27.577 22:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:27.577 22:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:27.577 22:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.577 22:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:27.836 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:27.836 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:27.836 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:28.096 [2024-07-12 22:29:38.244514] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:28.096 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
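[editor's note] The trace above drives the raid1 state machine by hand over JSON-RPC: malloc base bdevs are created and claimed by Existed_Raid, then deleted one at a time while verify_raid_bdev_state checks the state reported by bdev_raid_get_bdevs. The following is an editorial sketch (not part of the captured log) that condenses that create -> assemble -> degrade -> verify flow into a standalone script; the rpc.py path, socket, bdev names, and jq filter are taken from the log, but the ordering is simplified (the test itself, in test/bdev/bdev_raid.sh, issues bdev_raid_create before some base bdevs exist), so treat it as an illustration rather than the harness itself.

#!/usr/bin/env bash
# Editorial sketch only -- assumes an SPDK target is already running and
# listening on /var/tmp/spdk-raid.sock, as in the log above.
set -euo pipefail

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Create four 32 MiB malloc base bdevs with a 512-byte block size.
for i in 1 2 3 4; do
    $rpc bdev_malloc_create 32 512 -b "BaseBdev${i}"
done

# Assemble them into a raid1 bdev with an on-disk superblock (-s).
$rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

# Drop one member; raid1 has redundancy, so the array should stay online (degraded).
$rpc bdev_malloc_delete BaseBdev1

# Same query and jq selection the harness uses, narrowed here to the state field.
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'

[end editor's note]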
00:22:28.096 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:28.096 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.096 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:28.356 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:28.356 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:28.356 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:28.616 [2024-07-12 22:29:38.744408] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:28.616 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:28.616 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:28.616 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.616 22:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:28.876 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:28.876 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:28.876 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:29.135 [2024-07-12 22:29:39.246217] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:29.135 [2024-07-12 22:29:39.246303] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:29.135 [2024-07-12 22:29:39.258949] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:29.135 [2024-07-12 22:29:39.258986] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:29.135 [2024-07-12 22:29:39.258998] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe3e350 name Existed_Raid, state offline 00:22:29.135 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:29.135 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:29.135 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.135 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:29.395 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:29.395 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:29.395 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:29.395 22:29:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:29.395 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:29.395 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:29.654 BaseBdev2 00:22:29.654 22:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:29.654 22:29:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:29.654 22:29:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:29.654 22:29:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:29.654 22:29:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:29.654 22:29:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:29.654 22:29:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:29.913 22:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:29.913 [ 00:22:29.913 { 00:22:29.913 "name": "BaseBdev2", 00:22:29.913 "aliases": [ 00:22:29.913 "1b2ab606-c168-48d0-98db-2cdc38ab2189" 00:22:29.913 ], 00:22:29.913 "product_name": "Malloc disk", 00:22:29.913 "block_size": 512, 00:22:29.913 "num_blocks": 65536, 00:22:29.913 "uuid": "1b2ab606-c168-48d0-98db-2cdc38ab2189", 00:22:29.913 "assigned_rate_limits": { 00:22:29.913 "rw_ios_per_sec": 0, 00:22:29.913 "rw_mbytes_per_sec": 0, 00:22:29.913 "r_mbytes_per_sec": 0, 00:22:29.913 "w_mbytes_per_sec": 0 00:22:29.913 }, 00:22:29.913 "claimed": false, 00:22:29.913 "zoned": false, 00:22:29.913 "supported_io_types": { 00:22:29.913 "read": true, 00:22:29.913 "write": true, 00:22:29.913 "unmap": true, 00:22:29.913 "flush": true, 00:22:29.913 "reset": true, 00:22:29.913 "nvme_admin": false, 00:22:29.913 "nvme_io": false, 00:22:29.913 "nvme_io_md": false, 00:22:29.913 "write_zeroes": true, 00:22:29.913 "zcopy": true, 00:22:29.913 "get_zone_info": false, 00:22:29.913 "zone_management": false, 00:22:29.913 "zone_append": false, 00:22:29.913 "compare": false, 00:22:29.913 "compare_and_write": false, 00:22:29.913 "abort": true, 00:22:29.913 "seek_hole": false, 00:22:29.913 "seek_data": false, 00:22:29.913 "copy": true, 00:22:29.913 "nvme_iov_md": false 00:22:29.913 }, 00:22:29.913 "memory_domains": [ 00:22:29.913 { 00:22:29.913 "dma_device_id": "system", 00:22:29.913 "dma_device_type": 1 00:22:29.913 }, 00:22:29.913 { 00:22:29.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:29.913 "dma_device_type": 2 00:22:29.913 } 00:22:29.913 ], 00:22:29.913 "driver_specific": {} 00:22:29.913 } 00:22:29.913 ] 00:22:30.171 22:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:30.171 22:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:30.171 22:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:30.171 22:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:30.171 BaseBdev3 00:22:30.171 22:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:30.171 22:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:30.171 22:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:30.171 22:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:30.171 22:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:30.171 22:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:30.171 22:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:30.430 22:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:30.690 [ 00:22:30.690 { 00:22:30.690 "name": "BaseBdev3", 00:22:30.690 "aliases": [ 00:22:30.690 "bdde2816-6986-4cb9-8e7e-8823f07e0f0d" 00:22:30.690 ], 00:22:30.690 "product_name": "Malloc disk", 00:22:30.690 "block_size": 512, 00:22:30.690 "num_blocks": 65536, 00:22:30.690 "uuid": "bdde2816-6986-4cb9-8e7e-8823f07e0f0d", 00:22:30.690 "assigned_rate_limits": { 00:22:30.690 "rw_ios_per_sec": 0, 00:22:30.690 "rw_mbytes_per_sec": 0, 00:22:30.690 "r_mbytes_per_sec": 0, 00:22:30.690 "w_mbytes_per_sec": 0 00:22:30.690 }, 00:22:30.690 "claimed": false, 00:22:30.690 "zoned": false, 00:22:30.690 "supported_io_types": { 00:22:30.690 "read": true, 00:22:30.690 "write": true, 00:22:30.690 "unmap": true, 00:22:30.690 "flush": true, 00:22:30.690 "reset": true, 00:22:30.690 "nvme_admin": false, 00:22:30.690 "nvme_io": false, 00:22:30.690 "nvme_io_md": false, 00:22:30.690 "write_zeroes": true, 00:22:30.690 "zcopy": true, 00:22:30.690 "get_zone_info": false, 00:22:30.690 "zone_management": false, 00:22:30.690 "zone_append": false, 00:22:30.690 "compare": false, 00:22:30.690 "compare_and_write": false, 00:22:30.690 "abort": true, 00:22:30.690 "seek_hole": false, 00:22:30.690 "seek_data": false, 00:22:30.690 "copy": true, 00:22:30.690 "nvme_iov_md": false 00:22:30.690 }, 00:22:30.690 "memory_domains": [ 00:22:30.690 { 00:22:30.690 "dma_device_id": "system", 00:22:30.690 "dma_device_type": 1 00:22:30.690 }, 00:22:30.690 { 00:22:30.690 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:30.690 "dma_device_type": 2 00:22:30.690 } 00:22:30.690 ], 00:22:30.690 "driver_specific": {} 00:22:30.690 } 00:22:30.690 ] 00:22:30.690 22:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:30.690 22:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:30.690 22:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:30.690 22:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:30.950 BaseBdev4 00:22:30.950 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev4 00:22:30.950 22:29:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:30.950 22:29:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:30.950 22:29:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:30.950 22:29:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:30.950 22:29:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:30.950 22:29:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:31.210 22:29:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:31.569 [ 00:22:31.569 { 00:22:31.569 "name": "BaseBdev4", 00:22:31.569 "aliases": [ 00:22:31.569 "6d24e34e-afc3-40a3-ad74-2d4be94e1efc" 00:22:31.569 ], 00:22:31.569 "product_name": "Malloc disk", 00:22:31.569 "block_size": 512, 00:22:31.569 "num_blocks": 65536, 00:22:31.569 "uuid": "6d24e34e-afc3-40a3-ad74-2d4be94e1efc", 00:22:31.569 "assigned_rate_limits": { 00:22:31.569 "rw_ios_per_sec": 0, 00:22:31.569 "rw_mbytes_per_sec": 0, 00:22:31.569 "r_mbytes_per_sec": 0, 00:22:31.569 "w_mbytes_per_sec": 0 00:22:31.569 }, 00:22:31.569 "claimed": false, 00:22:31.569 "zoned": false, 00:22:31.569 "supported_io_types": { 00:22:31.569 "read": true, 00:22:31.569 "write": true, 00:22:31.570 "unmap": true, 00:22:31.570 "flush": true, 00:22:31.570 "reset": true, 00:22:31.570 "nvme_admin": false, 00:22:31.570 "nvme_io": false, 00:22:31.570 "nvme_io_md": false, 00:22:31.570 "write_zeroes": true, 00:22:31.570 "zcopy": true, 00:22:31.570 "get_zone_info": false, 00:22:31.570 "zone_management": false, 00:22:31.570 "zone_append": false, 00:22:31.570 "compare": false, 00:22:31.570 "compare_and_write": false, 00:22:31.570 "abort": true, 00:22:31.570 "seek_hole": false, 00:22:31.570 "seek_data": false, 00:22:31.570 "copy": true, 00:22:31.570 "nvme_iov_md": false 00:22:31.570 }, 00:22:31.570 "memory_domains": [ 00:22:31.570 { 00:22:31.570 "dma_device_id": "system", 00:22:31.570 "dma_device_type": 1 00:22:31.570 }, 00:22:31.570 { 00:22:31.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:31.570 "dma_device_type": 2 00:22:31.570 } 00:22:31.570 ], 00:22:31.570 "driver_specific": {} 00:22:31.570 } 00:22:31.570 ] 00:22:31.570 22:29:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:31.570 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:31.570 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:31.570 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:31.829 [2024-07-12 22:29:41.932158] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:31.829 [2024-07-12 22:29:41.932198] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:31.829 [2024-07-12 22:29:41.932218] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:31.829 [2024-07-12 22:29:41.933537] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:31.829 [2024-07-12 22:29:41.933579] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:31.829 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:31.829 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:31.829 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:31.829 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:31.829 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:31.829 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:31.829 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:31.829 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:31.829 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:31.829 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:31.829 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.829 22:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:32.088 22:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.088 "name": "Existed_Raid", 00:22:32.088 "uuid": "27cf3a0d-c6f5-4ae7-b7d7-957f0ee3bd3c", 00:22:32.088 "strip_size_kb": 0, 00:22:32.088 "state": "configuring", 00:22:32.088 "raid_level": "raid1", 00:22:32.088 "superblock": true, 00:22:32.088 "num_base_bdevs": 4, 00:22:32.088 "num_base_bdevs_discovered": 3, 00:22:32.088 "num_base_bdevs_operational": 4, 00:22:32.088 "base_bdevs_list": [ 00:22:32.088 { 00:22:32.088 "name": "BaseBdev1", 00:22:32.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.088 "is_configured": false, 00:22:32.088 "data_offset": 0, 00:22:32.088 "data_size": 0 00:22:32.088 }, 00:22:32.088 { 00:22:32.088 "name": "BaseBdev2", 00:22:32.088 "uuid": "1b2ab606-c168-48d0-98db-2cdc38ab2189", 00:22:32.088 "is_configured": true, 00:22:32.088 "data_offset": 2048, 00:22:32.088 "data_size": 63488 00:22:32.088 }, 00:22:32.088 { 00:22:32.088 "name": "BaseBdev3", 00:22:32.088 "uuid": "bdde2816-6986-4cb9-8e7e-8823f07e0f0d", 00:22:32.088 "is_configured": true, 00:22:32.088 "data_offset": 2048, 00:22:32.088 "data_size": 63488 00:22:32.088 }, 00:22:32.088 { 00:22:32.088 "name": "BaseBdev4", 00:22:32.088 "uuid": "6d24e34e-afc3-40a3-ad74-2d4be94e1efc", 00:22:32.088 "is_configured": true, 00:22:32.088 "data_offset": 2048, 00:22:32.088 "data_size": 63488 00:22:32.088 } 00:22:32.088 ] 00:22:32.088 }' 00:22:32.088 22:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.088 22:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:32.666 22:29:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:32.666 [2024-07-12 22:29:42.986939] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:32.925 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:32.925 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:32.925 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:32.925 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:32.925 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:32.925 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:32.925 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.925 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.925 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.925 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.925 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.925 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:33.184 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.184 "name": "Existed_Raid", 00:22:33.184 "uuid": "27cf3a0d-c6f5-4ae7-b7d7-957f0ee3bd3c", 00:22:33.184 "strip_size_kb": 0, 00:22:33.184 "state": "configuring", 00:22:33.184 "raid_level": "raid1", 00:22:33.184 "superblock": true, 00:22:33.184 "num_base_bdevs": 4, 00:22:33.184 "num_base_bdevs_discovered": 2, 00:22:33.184 "num_base_bdevs_operational": 4, 00:22:33.184 "base_bdevs_list": [ 00:22:33.184 { 00:22:33.184 "name": "BaseBdev1", 00:22:33.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.185 "is_configured": false, 00:22:33.185 "data_offset": 0, 00:22:33.185 "data_size": 0 00:22:33.185 }, 00:22:33.185 { 00:22:33.185 "name": null, 00:22:33.185 "uuid": "1b2ab606-c168-48d0-98db-2cdc38ab2189", 00:22:33.185 "is_configured": false, 00:22:33.185 "data_offset": 2048, 00:22:33.185 "data_size": 63488 00:22:33.185 }, 00:22:33.185 { 00:22:33.185 "name": "BaseBdev3", 00:22:33.185 "uuid": "bdde2816-6986-4cb9-8e7e-8823f07e0f0d", 00:22:33.185 "is_configured": true, 00:22:33.185 "data_offset": 2048, 00:22:33.185 "data_size": 63488 00:22:33.185 }, 00:22:33.185 { 00:22:33.185 "name": "BaseBdev4", 00:22:33.185 "uuid": "6d24e34e-afc3-40a3-ad74-2d4be94e1efc", 00:22:33.185 "is_configured": true, 00:22:33.185 "data_offset": 2048, 00:22:33.185 "data_size": 63488 00:22:33.185 } 00:22:33.185 ] 00:22:33.185 }' 00:22:33.185 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.185 22:29:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:33.753 22:29:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.753 22:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:33.753 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:33.753 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:34.012 [2024-07-12 22:29:44.307022] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:34.012 BaseBdev1 00:22:34.012 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:34.012 22:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:34.012 22:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:34.012 22:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:34.012 22:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:34.012 22:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:34.012 22:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:34.272 22:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:34.531 [ 00:22:34.531 { 00:22:34.531 "name": "BaseBdev1", 00:22:34.531 "aliases": [ 00:22:34.531 "0d46eaa3-882d-46d4-b2de-e97544b58b48" 00:22:34.531 ], 00:22:34.531 "product_name": "Malloc disk", 00:22:34.531 "block_size": 512, 00:22:34.531 "num_blocks": 65536, 00:22:34.531 "uuid": "0d46eaa3-882d-46d4-b2de-e97544b58b48", 00:22:34.531 "assigned_rate_limits": { 00:22:34.531 "rw_ios_per_sec": 0, 00:22:34.531 "rw_mbytes_per_sec": 0, 00:22:34.531 "r_mbytes_per_sec": 0, 00:22:34.531 "w_mbytes_per_sec": 0 00:22:34.531 }, 00:22:34.531 "claimed": true, 00:22:34.531 "claim_type": "exclusive_write", 00:22:34.531 "zoned": false, 00:22:34.531 "supported_io_types": { 00:22:34.531 "read": true, 00:22:34.531 "write": true, 00:22:34.531 "unmap": true, 00:22:34.531 "flush": true, 00:22:34.531 "reset": true, 00:22:34.531 "nvme_admin": false, 00:22:34.531 "nvme_io": false, 00:22:34.531 "nvme_io_md": false, 00:22:34.531 "write_zeroes": true, 00:22:34.531 "zcopy": true, 00:22:34.531 "get_zone_info": false, 00:22:34.531 "zone_management": false, 00:22:34.531 "zone_append": false, 00:22:34.531 "compare": false, 00:22:34.531 "compare_and_write": false, 00:22:34.531 "abort": true, 00:22:34.531 "seek_hole": false, 00:22:34.531 "seek_data": false, 00:22:34.531 "copy": true, 00:22:34.531 "nvme_iov_md": false 00:22:34.531 }, 00:22:34.531 "memory_domains": [ 00:22:34.531 { 00:22:34.531 "dma_device_id": "system", 00:22:34.531 "dma_device_type": 1 00:22:34.531 }, 00:22:34.531 { 00:22:34.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:34.531 "dma_device_type": 2 00:22:34.531 } 00:22:34.531 ], 00:22:34.531 "driver_specific": {} 00:22:34.531 } 00:22:34.531 ] 00:22:34.531 
22:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:34.531 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:34.531 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:34.531 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:34.531 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:34.531 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:34.531 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:34.531 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:34.531 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:34.531 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:34.531 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:34.531 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.532 22:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:34.791 22:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:34.791 "name": "Existed_Raid", 00:22:34.791 "uuid": "27cf3a0d-c6f5-4ae7-b7d7-957f0ee3bd3c", 00:22:34.791 "strip_size_kb": 0, 00:22:34.791 "state": "configuring", 00:22:34.791 "raid_level": "raid1", 00:22:34.791 "superblock": true, 00:22:34.791 "num_base_bdevs": 4, 00:22:34.791 "num_base_bdevs_discovered": 3, 00:22:34.791 "num_base_bdevs_operational": 4, 00:22:34.791 "base_bdevs_list": [ 00:22:34.791 { 00:22:34.791 "name": "BaseBdev1", 00:22:34.791 "uuid": "0d46eaa3-882d-46d4-b2de-e97544b58b48", 00:22:34.791 "is_configured": true, 00:22:34.791 "data_offset": 2048, 00:22:34.791 "data_size": 63488 00:22:34.791 }, 00:22:34.791 { 00:22:34.791 "name": null, 00:22:34.791 "uuid": "1b2ab606-c168-48d0-98db-2cdc38ab2189", 00:22:34.791 "is_configured": false, 00:22:34.791 "data_offset": 2048, 00:22:34.791 "data_size": 63488 00:22:34.791 }, 00:22:34.791 { 00:22:34.791 "name": "BaseBdev3", 00:22:34.791 "uuid": "bdde2816-6986-4cb9-8e7e-8823f07e0f0d", 00:22:34.791 "is_configured": true, 00:22:34.791 "data_offset": 2048, 00:22:34.791 "data_size": 63488 00:22:34.791 }, 00:22:34.791 { 00:22:34.791 "name": "BaseBdev4", 00:22:34.791 "uuid": "6d24e34e-afc3-40a3-ad74-2d4be94e1efc", 00:22:34.791 "is_configured": true, 00:22:34.791 "data_offset": 2048, 00:22:34.791 "data_size": 63488 00:22:34.791 } 00:22:34.791 ] 00:22:34.791 }' 00:22:34.791 22:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:34.791 22:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:35.359 22:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.359 22:29:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:35.618 22:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:35.618 22:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:35.878 [2024-07-12 22:29:46.027610] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:35.878 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:35.878 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:35.878 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:35.878 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:35.878 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:35.878 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:35.878 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:35.878 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:35.878 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:35.878 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:35.878 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.878 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:36.137 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:36.137 "name": "Existed_Raid", 00:22:36.137 "uuid": "27cf3a0d-c6f5-4ae7-b7d7-957f0ee3bd3c", 00:22:36.137 "strip_size_kb": 0, 00:22:36.137 "state": "configuring", 00:22:36.137 "raid_level": "raid1", 00:22:36.137 "superblock": true, 00:22:36.137 "num_base_bdevs": 4, 00:22:36.137 "num_base_bdevs_discovered": 2, 00:22:36.137 "num_base_bdevs_operational": 4, 00:22:36.137 "base_bdevs_list": [ 00:22:36.137 { 00:22:36.137 "name": "BaseBdev1", 00:22:36.137 "uuid": "0d46eaa3-882d-46d4-b2de-e97544b58b48", 00:22:36.137 "is_configured": true, 00:22:36.137 "data_offset": 2048, 00:22:36.137 "data_size": 63488 00:22:36.137 }, 00:22:36.137 { 00:22:36.137 "name": null, 00:22:36.137 "uuid": "1b2ab606-c168-48d0-98db-2cdc38ab2189", 00:22:36.137 "is_configured": false, 00:22:36.137 "data_offset": 2048, 00:22:36.137 "data_size": 63488 00:22:36.137 }, 00:22:36.137 { 00:22:36.137 "name": null, 00:22:36.137 "uuid": "bdde2816-6986-4cb9-8e7e-8823f07e0f0d", 00:22:36.137 "is_configured": false, 00:22:36.137 "data_offset": 2048, 00:22:36.137 "data_size": 63488 00:22:36.137 }, 00:22:36.137 { 00:22:36.137 "name": "BaseBdev4", 00:22:36.137 "uuid": "6d24e34e-afc3-40a3-ad74-2d4be94e1efc", 00:22:36.137 "is_configured": true, 00:22:36.137 "data_offset": 2048, 00:22:36.137 "data_size": 63488 00:22:36.137 } 00:22:36.137 ] 00:22:36.137 }' 00:22:36.138 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:22:36.138 22:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:36.707 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.707 22:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:36.966 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:36.966 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:36.966 [2024-07-12 22:29:47.278966] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:37.225 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:37.225 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:37.225 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:37.226 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:37.226 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:37.226 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:37.226 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.226 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.226 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.226 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.226 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.226 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:37.226 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.226 "name": "Existed_Raid", 00:22:37.226 "uuid": "27cf3a0d-c6f5-4ae7-b7d7-957f0ee3bd3c", 00:22:37.226 "strip_size_kb": 0, 00:22:37.226 "state": "configuring", 00:22:37.226 "raid_level": "raid1", 00:22:37.226 "superblock": true, 00:22:37.226 "num_base_bdevs": 4, 00:22:37.226 "num_base_bdevs_discovered": 3, 00:22:37.226 "num_base_bdevs_operational": 4, 00:22:37.226 "base_bdevs_list": [ 00:22:37.226 { 00:22:37.226 "name": "BaseBdev1", 00:22:37.226 "uuid": "0d46eaa3-882d-46d4-b2de-e97544b58b48", 00:22:37.226 "is_configured": true, 00:22:37.226 "data_offset": 2048, 00:22:37.226 "data_size": 63488 00:22:37.226 }, 00:22:37.226 { 00:22:37.226 "name": null, 00:22:37.226 "uuid": "1b2ab606-c168-48d0-98db-2cdc38ab2189", 00:22:37.226 "is_configured": false, 00:22:37.226 "data_offset": 2048, 00:22:37.226 "data_size": 63488 00:22:37.226 }, 00:22:37.226 { 00:22:37.226 "name": "BaseBdev3", 00:22:37.226 "uuid": "bdde2816-6986-4cb9-8e7e-8823f07e0f0d", 00:22:37.226 "is_configured": true, 
00:22:37.226 "data_offset": 2048, 00:22:37.226 "data_size": 63488 00:22:37.226 }, 00:22:37.226 { 00:22:37.226 "name": "BaseBdev4", 00:22:37.226 "uuid": "6d24e34e-afc3-40a3-ad74-2d4be94e1efc", 00:22:37.226 "is_configured": true, 00:22:37.226 "data_offset": 2048, 00:22:37.226 "data_size": 63488 00:22:37.226 } 00:22:37.226 ] 00:22:37.226 }' 00:22:37.226 22:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.226 22:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:38.165 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.165 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:38.165 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:38.165 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:38.424 [2024-07-12 22:29:48.618512] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:38.424 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:38.424 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:38.424 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:38.424 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:38.424 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:38.424 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:38.424 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:38.424 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:38.424 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:38.424 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:38.425 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.425 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:38.684 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:38.684 "name": "Existed_Raid", 00:22:38.684 "uuid": "27cf3a0d-c6f5-4ae7-b7d7-957f0ee3bd3c", 00:22:38.684 "strip_size_kb": 0, 00:22:38.684 "state": "configuring", 00:22:38.684 "raid_level": "raid1", 00:22:38.684 "superblock": true, 00:22:38.684 "num_base_bdevs": 4, 00:22:38.684 "num_base_bdevs_discovered": 2, 00:22:38.684 "num_base_bdevs_operational": 4, 00:22:38.684 "base_bdevs_list": [ 00:22:38.684 { 00:22:38.684 "name": null, 00:22:38.684 "uuid": "0d46eaa3-882d-46d4-b2de-e97544b58b48", 00:22:38.684 "is_configured": false, 00:22:38.684 "data_offset": 2048, 00:22:38.684 "data_size": 63488 
00:22:38.684 }, 00:22:38.684 { 00:22:38.684 "name": null, 00:22:38.684 "uuid": "1b2ab606-c168-48d0-98db-2cdc38ab2189", 00:22:38.684 "is_configured": false, 00:22:38.684 "data_offset": 2048, 00:22:38.684 "data_size": 63488 00:22:38.684 }, 00:22:38.684 { 00:22:38.684 "name": "BaseBdev3", 00:22:38.684 "uuid": "bdde2816-6986-4cb9-8e7e-8823f07e0f0d", 00:22:38.684 "is_configured": true, 00:22:38.684 "data_offset": 2048, 00:22:38.684 "data_size": 63488 00:22:38.684 }, 00:22:38.684 { 00:22:38.684 "name": "BaseBdev4", 00:22:38.684 "uuid": "6d24e34e-afc3-40a3-ad74-2d4be94e1efc", 00:22:38.684 "is_configured": true, 00:22:38.684 "data_offset": 2048, 00:22:38.684 "data_size": 63488 00:22:38.685 } 00:22:38.685 ] 00:22:38.685 }' 00:22:38.685 22:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:38.685 22:29:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:39.253 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.253 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:39.512 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:39.512 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:39.771 [2024-07-12 22:29:49.969085] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:39.771 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:39.771 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:39.771 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:39.771 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:39.771 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:39.771 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:39.771 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:39.771 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:39.771 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:39.771 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:39.771 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:39.772 22:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.031 22:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:40.031 "name": "Existed_Raid", 00:22:40.031 "uuid": "27cf3a0d-c6f5-4ae7-b7d7-957f0ee3bd3c", 00:22:40.031 "strip_size_kb": 0, 00:22:40.031 "state": "configuring", 00:22:40.031 
"raid_level": "raid1", 00:22:40.031 "superblock": true, 00:22:40.031 "num_base_bdevs": 4, 00:22:40.031 "num_base_bdevs_discovered": 3, 00:22:40.031 "num_base_bdevs_operational": 4, 00:22:40.031 "base_bdevs_list": [ 00:22:40.031 { 00:22:40.031 "name": null, 00:22:40.031 "uuid": "0d46eaa3-882d-46d4-b2de-e97544b58b48", 00:22:40.031 "is_configured": false, 00:22:40.031 "data_offset": 2048, 00:22:40.031 "data_size": 63488 00:22:40.031 }, 00:22:40.031 { 00:22:40.031 "name": "BaseBdev2", 00:22:40.031 "uuid": "1b2ab606-c168-48d0-98db-2cdc38ab2189", 00:22:40.031 "is_configured": true, 00:22:40.031 "data_offset": 2048, 00:22:40.031 "data_size": 63488 00:22:40.031 }, 00:22:40.031 { 00:22:40.031 "name": "BaseBdev3", 00:22:40.031 "uuid": "bdde2816-6986-4cb9-8e7e-8823f07e0f0d", 00:22:40.031 "is_configured": true, 00:22:40.031 "data_offset": 2048, 00:22:40.031 "data_size": 63488 00:22:40.031 }, 00:22:40.031 { 00:22:40.031 "name": "BaseBdev4", 00:22:40.031 "uuid": "6d24e34e-afc3-40a3-ad74-2d4be94e1efc", 00:22:40.031 "is_configured": true, 00:22:40.031 "data_offset": 2048, 00:22:40.031 "data_size": 63488 00:22:40.031 } 00:22:40.031 ] 00:22:40.031 }' 00:22:40.031 22:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:40.031 22:29:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:40.599 22:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.599 22:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:40.858 22:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:40.858 22:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.858 22:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:41.117 22:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 0d46eaa3-882d-46d4-b2de-e97544b58b48 00:22:41.376 [2024-07-12 22:29:51.533829] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:41.376 [2024-07-12 22:29:51.534012] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe40180 00:22:41.376 [2024-07-12 22:29:51.534027] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:41.376 [2024-07-12 22:29:51.534202] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe40c20 00:22:41.376 [2024-07-12 22:29:51.534334] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe40180 00:22:41.376 [2024-07-12 22:29:51.534344] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe40180 00:22:41.376 [2024-07-12 22:29:51.534440] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:41.376 NewBaseBdev 00:22:41.376 22:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:41.376 22:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 
00:22:41.376 22:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:41.376 22:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:41.377 22:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:41.377 22:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:41.377 22:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:41.635 22:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:41.894 [ 00:22:41.894 { 00:22:41.894 "name": "NewBaseBdev", 00:22:41.894 "aliases": [ 00:22:41.894 "0d46eaa3-882d-46d4-b2de-e97544b58b48" 00:22:41.894 ], 00:22:41.894 "product_name": "Malloc disk", 00:22:41.894 "block_size": 512, 00:22:41.894 "num_blocks": 65536, 00:22:41.894 "uuid": "0d46eaa3-882d-46d4-b2de-e97544b58b48", 00:22:41.894 "assigned_rate_limits": { 00:22:41.894 "rw_ios_per_sec": 0, 00:22:41.894 "rw_mbytes_per_sec": 0, 00:22:41.894 "r_mbytes_per_sec": 0, 00:22:41.894 "w_mbytes_per_sec": 0 00:22:41.894 }, 00:22:41.894 "claimed": true, 00:22:41.894 "claim_type": "exclusive_write", 00:22:41.894 "zoned": false, 00:22:41.894 "supported_io_types": { 00:22:41.894 "read": true, 00:22:41.894 "write": true, 00:22:41.894 "unmap": true, 00:22:41.894 "flush": true, 00:22:41.894 "reset": true, 00:22:41.894 "nvme_admin": false, 00:22:41.894 "nvme_io": false, 00:22:41.894 "nvme_io_md": false, 00:22:41.894 "write_zeroes": true, 00:22:41.894 "zcopy": true, 00:22:41.894 "get_zone_info": false, 00:22:41.894 "zone_management": false, 00:22:41.894 "zone_append": false, 00:22:41.894 "compare": false, 00:22:41.894 "compare_and_write": false, 00:22:41.894 "abort": true, 00:22:41.894 "seek_hole": false, 00:22:41.894 "seek_data": false, 00:22:41.894 "copy": true, 00:22:41.894 "nvme_iov_md": false 00:22:41.894 }, 00:22:41.894 "memory_domains": [ 00:22:41.894 { 00:22:41.894 "dma_device_id": "system", 00:22:41.894 "dma_device_type": 1 00:22:41.894 }, 00:22:41.894 { 00:22:41.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.894 "dma_device_type": 2 00:22:41.894 } 00:22:41.894 ], 00:22:41.894 "driver_specific": {} 00:22:41.894 } 00:22:41.894 ] 00:22:41.894 22:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:41.894 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:41.894 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:41.894 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:41.894 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:41.894 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:41.894 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:41.894 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:41.894 22:29:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:41.894 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:41.894 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:41.894 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.894 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:42.153 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.153 "name": "Existed_Raid", 00:22:42.153 "uuid": "27cf3a0d-c6f5-4ae7-b7d7-957f0ee3bd3c", 00:22:42.153 "strip_size_kb": 0, 00:22:42.153 "state": "online", 00:22:42.153 "raid_level": "raid1", 00:22:42.153 "superblock": true, 00:22:42.153 "num_base_bdevs": 4, 00:22:42.153 "num_base_bdevs_discovered": 4, 00:22:42.153 "num_base_bdevs_operational": 4, 00:22:42.153 "base_bdevs_list": [ 00:22:42.153 { 00:22:42.153 "name": "NewBaseBdev", 00:22:42.153 "uuid": "0d46eaa3-882d-46d4-b2de-e97544b58b48", 00:22:42.153 "is_configured": true, 00:22:42.153 "data_offset": 2048, 00:22:42.153 "data_size": 63488 00:22:42.153 }, 00:22:42.153 { 00:22:42.153 "name": "BaseBdev2", 00:22:42.153 "uuid": "1b2ab606-c168-48d0-98db-2cdc38ab2189", 00:22:42.153 "is_configured": true, 00:22:42.153 "data_offset": 2048, 00:22:42.153 "data_size": 63488 00:22:42.153 }, 00:22:42.153 { 00:22:42.153 "name": "BaseBdev3", 00:22:42.153 "uuid": "bdde2816-6986-4cb9-8e7e-8823f07e0f0d", 00:22:42.153 "is_configured": true, 00:22:42.153 "data_offset": 2048, 00:22:42.153 "data_size": 63488 00:22:42.153 }, 00:22:42.153 { 00:22:42.153 "name": "BaseBdev4", 00:22:42.153 "uuid": "6d24e34e-afc3-40a3-ad74-2d4be94e1efc", 00:22:42.153 "is_configured": true, 00:22:42.153 "data_offset": 2048, 00:22:42.153 "data_size": 63488 00:22:42.153 } 00:22:42.153 ] 00:22:42.153 }' 00:22:42.153 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.153 22:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:42.732 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:42.733 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:42.733 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:42.733 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:42.733 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:42.733 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:42.733 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:42.733 22:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:42.994 [2024-07-12 22:29:53.086317] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:42.994 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:42.994 "name": 
"Existed_Raid", 00:22:42.994 "aliases": [ 00:22:42.994 "27cf3a0d-c6f5-4ae7-b7d7-957f0ee3bd3c" 00:22:42.994 ], 00:22:42.994 "product_name": "Raid Volume", 00:22:42.994 "block_size": 512, 00:22:42.994 "num_blocks": 63488, 00:22:42.994 "uuid": "27cf3a0d-c6f5-4ae7-b7d7-957f0ee3bd3c", 00:22:42.994 "assigned_rate_limits": { 00:22:42.994 "rw_ios_per_sec": 0, 00:22:42.994 "rw_mbytes_per_sec": 0, 00:22:42.994 "r_mbytes_per_sec": 0, 00:22:42.994 "w_mbytes_per_sec": 0 00:22:42.994 }, 00:22:42.994 "claimed": false, 00:22:42.994 "zoned": false, 00:22:42.994 "supported_io_types": { 00:22:42.994 "read": true, 00:22:42.994 "write": true, 00:22:42.994 "unmap": false, 00:22:42.994 "flush": false, 00:22:42.994 "reset": true, 00:22:42.994 "nvme_admin": false, 00:22:42.994 "nvme_io": false, 00:22:42.994 "nvme_io_md": false, 00:22:42.994 "write_zeroes": true, 00:22:42.994 "zcopy": false, 00:22:42.994 "get_zone_info": false, 00:22:42.994 "zone_management": false, 00:22:42.994 "zone_append": false, 00:22:42.994 "compare": false, 00:22:42.994 "compare_and_write": false, 00:22:42.994 "abort": false, 00:22:42.994 "seek_hole": false, 00:22:42.994 "seek_data": false, 00:22:42.994 "copy": false, 00:22:42.994 "nvme_iov_md": false 00:22:42.994 }, 00:22:42.994 "memory_domains": [ 00:22:42.994 { 00:22:42.994 "dma_device_id": "system", 00:22:42.994 "dma_device_type": 1 00:22:42.994 }, 00:22:42.994 { 00:22:42.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.994 "dma_device_type": 2 00:22:42.994 }, 00:22:42.994 { 00:22:42.994 "dma_device_id": "system", 00:22:42.994 "dma_device_type": 1 00:22:42.994 }, 00:22:42.994 { 00:22:42.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.994 "dma_device_type": 2 00:22:42.994 }, 00:22:42.994 { 00:22:42.994 "dma_device_id": "system", 00:22:42.994 "dma_device_type": 1 00:22:42.994 }, 00:22:42.994 { 00:22:42.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.994 "dma_device_type": 2 00:22:42.994 }, 00:22:42.994 { 00:22:42.994 "dma_device_id": "system", 00:22:42.994 "dma_device_type": 1 00:22:42.994 }, 00:22:42.994 { 00:22:42.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.994 "dma_device_type": 2 00:22:42.994 } 00:22:42.994 ], 00:22:42.994 "driver_specific": { 00:22:42.994 "raid": { 00:22:42.994 "uuid": "27cf3a0d-c6f5-4ae7-b7d7-957f0ee3bd3c", 00:22:42.994 "strip_size_kb": 0, 00:22:42.994 "state": "online", 00:22:42.994 "raid_level": "raid1", 00:22:42.994 "superblock": true, 00:22:42.994 "num_base_bdevs": 4, 00:22:42.994 "num_base_bdevs_discovered": 4, 00:22:42.994 "num_base_bdevs_operational": 4, 00:22:42.994 "base_bdevs_list": [ 00:22:42.994 { 00:22:42.994 "name": "NewBaseBdev", 00:22:42.994 "uuid": "0d46eaa3-882d-46d4-b2de-e97544b58b48", 00:22:42.994 "is_configured": true, 00:22:42.994 "data_offset": 2048, 00:22:42.994 "data_size": 63488 00:22:42.994 }, 00:22:42.994 { 00:22:42.994 "name": "BaseBdev2", 00:22:42.994 "uuid": "1b2ab606-c168-48d0-98db-2cdc38ab2189", 00:22:42.994 "is_configured": true, 00:22:42.994 "data_offset": 2048, 00:22:42.994 "data_size": 63488 00:22:42.994 }, 00:22:42.994 { 00:22:42.994 "name": "BaseBdev3", 00:22:42.994 "uuid": "bdde2816-6986-4cb9-8e7e-8823f07e0f0d", 00:22:42.994 "is_configured": true, 00:22:42.994 "data_offset": 2048, 00:22:42.994 "data_size": 63488 00:22:42.994 }, 00:22:42.994 { 00:22:42.994 "name": "BaseBdev4", 00:22:42.994 "uuid": "6d24e34e-afc3-40a3-ad74-2d4be94e1efc", 00:22:42.994 "is_configured": true, 00:22:42.994 "data_offset": 2048, 00:22:42.994 "data_size": 63488 00:22:42.994 } 00:22:42.994 ] 00:22:42.994 } 00:22:42.994 } 
00:22:42.994 }' 00:22:42.994 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:42.994 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:42.994 BaseBdev2 00:22:42.994 BaseBdev3 00:22:42.994 BaseBdev4' 00:22:42.994 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:42.994 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:42.994 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:43.254 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:43.254 "name": "NewBaseBdev", 00:22:43.254 "aliases": [ 00:22:43.254 "0d46eaa3-882d-46d4-b2de-e97544b58b48" 00:22:43.254 ], 00:22:43.254 "product_name": "Malloc disk", 00:22:43.254 "block_size": 512, 00:22:43.254 "num_blocks": 65536, 00:22:43.254 "uuid": "0d46eaa3-882d-46d4-b2de-e97544b58b48", 00:22:43.254 "assigned_rate_limits": { 00:22:43.254 "rw_ios_per_sec": 0, 00:22:43.254 "rw_mbytes_per_sec": 0, 00:22:43.254 "r_mbytes_per_sec": 0, 00:22:43.254 "w_mbytes_per_sec": 0 00:22:43.254 }, 00:22:43.254 "claimed": true, 00:22:43.254 "claim_type": "exclusive_write", 00:22:43.254 "zoned": false, 00:22:43.254 "supported_io_types": { 00:22:43.254 "read": true, 00:22:43.254 "write": true, 00:22:43.254 "unmap": true, 00:22:43.254 "flush": true, 00:22:43.254 "reset": true, 00:22:43.254 "nvme_admin": false, 00:22:43.254 "nvme_io": false, 00:22:43.254 "nvme_io_md": false, 00:22:43.254 "write_zeroes": true, 00:22:43.254 "zcopy": true, 00:22:43.254 "get_zone_info": false, 00:22:43.254 "zone_management": false, 00:22:43.254 "zone_append": false, 00:22:43.254 "compare": false, 00:22:43.254 "compare_and_write": false, 00:22:43.254 "abort": true, 00:22:43.254 "seek_hole": false, 00:22:43.254 "seek_data": false, 00:22:43.254 "copy": true, 00:22:43.254 "nvme_iov_md": false 00:22:43.254 }, 00:22:43.254 "memory_domains": [ 00:22:43.254 { 00:22:43.254 "dma_device_id": "system", 00:22:43.254 "dma_device_type": 1 00:22:43.254 }, 00:22:43.254 { 00:22:43.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:43.254 "dma_device_type": 2 00:22:43.254 } 00:22:43.254 ], 00:22:43.254 "driver_specific": {} 00:22:43.254 }' 00:22:43.254 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:43.254 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:43.254 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:43.254 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:43.254 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:43.254 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:43.254 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:43.513 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:43.513 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:43.513 22:29:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:43.513 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:43.513 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:43.513 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:43.513 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:43.513 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:43.772 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:43.772 "name": "BaseBdev2", 00:22:43.772 "aliases": [ 00:22:43.772 "1b2ab606-c168-48d0-98db-2cdc38ab2189" 00:22:43.772 ], 00:22:43.772 "product_name": "Malloc disk", 00:22:43.772 "block_size": 512, 00:22:43.772 "num_blocks": 65536, 00:22:43.772 "uuid": "1b2ab606-c168-48d0-98db-2cdc38ab2189", 00:22:43.772 "assigned_rate_limits": { 00:22:43.772 "rw_ios_per_sec": 0, 00:22:43.772 "rw_mbytes_per_sec": 0, 00:22:43.772 "r_mbytes_per_sec": 0, 00:22:43.772 "w_mbytes_per_sec": 0 00:22:43.772 }, 00:22:43.772 "claimed": true, 00:22:43.772 "claim_type": "exclusive_write", 00:22:43.772 "zoned": false, 00:22:43.772 "supported_io_types": { 00:22:43.772 "read": true, 00:22:43.772 "write": true, 00:22:43.772 "unmap": true, 00:22:43.772 "flush": true, 00:22:43.772 "reset": true, 00:22:43.772 "nvme_admin": false, 00:22:43.772 "nvme_io": false, 00:22:43.772 "nvme_io_md": false, 00:22:43.772 "write_zeroes": true, 00:22:43.772 "zcopy": true, 00:22:43.772 "get_zone_info": false, 00:22:43.772 "zone_management": false, 00:22:43.772 "zone_append": false, 00:22:43.772 "compare": false, 00:22:43.772 "compare_and_write": false, 00:22:43.772 "abort": true, 00:22:43.772 "seek_hole": false, 00:22:43.772 "seek_data": false, 00:22:43.772 "copy": true, 00:22:43.772 "nvme_iov_md": false 00:22:43.772 }, 00:22:43.772 "memory_domains": [ 00:22:43.772 { 00:22:43.772 "dma_device_id": "system", 00:22:43.772 "dma_device_type": 1 00:22:43.772 }, 00:22:43.772 { 00:22:43.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:43.772 "dma_device_type": 2 00:22:43.772 } 00:22:43.772 ], 00:22:43.772 "driver_specific": {} 00:22:43.772 }' 00:22:43.772 22:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:43.772 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:43.772 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:43.772 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:44.031 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:44.031 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:44.031 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:44.031 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:44.031 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:44.031 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:44.031 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:22:44.031 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:44.031 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:44.031 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:44.031 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:44.290 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:44.290 "name": "BaseBdev3", 00:22:44.290 "aliases": [ 00:22:44.290 "bdde2816-6986-4cb9-8e7e-8823f07e0f0d" 00:22:44.290 ], 00:22:44.290 "product_name": "Malloc disk", 00:22:44.290 "block_size": 512, 00:22:44.290 "num_blocks": 65536, 00:22:44.290 "uuid": "bdde2816-6986-4cb9-8e7e-8823f07e0f0d", 00:22:44.290 "assigned_rate_limits": { 00:22:44.290 "rw_ios_per_sec": 0, 00:22:44.290 "rw_mbytes_per_sec": 0, 00:22:44.290 "r_mbytes_per_sec": 0, 00:22:44.290 "w_mbytes_per_sec": 0 00:22:44.290 }, 00:22:44.290 "claimed": true, 00:22:44.290 "claim_type": "exclusive_write", 00:22:44.290 "zoned": false, 00:22:44.290 "supported_io_types": { 00:22:44.290 "read": true, 00:22:44.290 "write": true, 00:22:44.290 "unmap": true, 00:22:44.290 "flush": true, 00:22:44.290 "reset": true, 00:22:44.290 "nvme_admin": false, 00:22:44.290 "nvme_io": false, 00:22:44.290 "nvme_io_md": false, 00:22:44.290 "write_zeroes": true, 00:22:44.290 "zcopy": true, 00:22:44.290 "get_zone_info": false, 00:22:44.290 "zone_management": false, 00:22:44.290 "zone_append": false, 00:22:44.290 "compare": false, 00:22:44.290 "compare_and_write": false, 00:22:44.290 "abort": true, 00:22:44.290 "seek_hole": false, 00:22:44.290 "seek_data": false, 00:22:44.290 "copy": true, 00:22:44.290 "nvme_iov_md": false 00:22:44.290 }, 00:22:44.290 "memory_domains": [ 00:22:44.290 { 00:22:44.290 "dma_device_id": "system", 00:22:44.290 "dma_device_type": 1 00:22:44.290 }, 00:22:44.290 { 00:22:44.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:44.290 "dma_device_type": 2 00:22:44.290 } 00:22:44.290 ], 00:22:44.290 "driver_specific": {} 00:22:44.290 }' 00:22:44.290 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:44.550 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:44.550 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:44.550 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:44.550 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:44.550 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:44.550 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:44.550 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:44.550 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:44.550 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:44.809 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:44.809 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:44.809 
22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:44.809 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:44.809 22:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:45.068 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:45.068 "name": "BaseBdev4", 00:22:45.068 "aliases": [ 00:22:45.068 "6d24e34e-afc3-40a3-ad74-2d4be94e1efc" 00:22:45.068 ], 00:22:45.068 "product_name": "Malloc disk", 00:22:45.068 "block_size": 512, 00:22:45.068 "num_blocks": 65536, 00:22:45.068 "uuid": "6d24e34e-afc3-40a3-ad74-2d4be94e1efc", 00:22:45.068 "assigned_rate_limits": { 00:22:45.068 "rw_ios_per_sec": 0, 00:22:45.068 "rw_mbytes_per_sec": 0, 00:22:45.068 "r_mbytes_per_sec": 0, 00:22:45.068 "w_mbytes_per_sec": 0 00:22:45.068 }, 00:22:45.068 "claimed": true, 00:22:45.068 "claim_type": "exclusive_write", 00:22:45.068 "zoned": false, 00:22:45.068 "supported_io_types": { 00:22:45.068 "read": true, 00:22:45.068 "write": true, 00:22:45.068 "unmap": true, 00:22:45.068 "flush": true, 00:22:45.068 "reset": true, 00:22:45.068 "nvme_admin": false, 00:22:45.068 "nvme_io": false, 00:22:45.068 "nvme_io_md": false, 00:22:45.068 "write_zeroes": true, 00:22:45.068 "zcopy": true, 00:22:45.068 "get_zone_info": false, 00:22:45.068 "zone_management": false, 00:22:45.068 "zone_append": false, 00:22:45.068 "compare": false, 00:22:45.069 "compare_and_write": false, 00:22:45.069 "abort": true, 00:22:45.069 "seek_hole": false, 00:22:45.069 "seek_data": false, 00:22:45.069 "copy": true, 00:22:45.069 "nvme_iov_md": false 00:22:45.069 }, 00:22:45.069 "memory_domains": [ 00:22:45.069 { 00:22:45.069 "dma_device_id": "system", 00:22:45.069 "dma_device_type": 1 00:22:45.069 }, 00:22:45.069 { 00:22:45.069 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.069 "dma_device_type": 2 00:22:45.069 } 00:22:45.069 ], 00:22:45.069 "driver_specific": {} 00:22:45.069 }' 00:22:45.069 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.069 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.069 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:45.069 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:45.069 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:45.069 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:45.069 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:45.069 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:45.327 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:45.328 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:45.328 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:45.328 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:45.328 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:45.588 [2024-07-12 22:29:55.749129] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:45.588 [2024-07-12 22:29:55.749157] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:45.588 [2024-07-12 22:29:55.749210] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:45.588 [2024-07-12 22:29:55.749497] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:45.588 [2024-07-12 22:29:55.749510] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe40180 name Existed_Raid, state offline 00:22:45.588 22:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 3516223 00:22:45.588 22:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3516223 ']' 00:22:45.588 22:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 3516223 00:22:45.588 22:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:22:45.588 22:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:45.588 22:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3516223 00:22:45.588 22:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:45.588 22:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:45.588 22:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3516223' 00:22:45.588 killing process with pid 3516223 00:22:45.588 22:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 3516223 00:22:45.588 [2024-07-12 22:29:55.820083] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:45.588 22:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 3516223 00:22:45.588 [2024-07-12 22:29:55.858624] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:45.847 22:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:22:45.847 00:22:45.847 real 0m32.153s 00:22:45.847 user 0m58.973s 00:22:45.847 sys 0m5.770s 00:22:45.847 22:29:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:45.847 22:29:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:45.847 ************************************ 00:22:45.847 END TEST raid_state_function_test_sb 00:22:45.847 ************************************ 00:22:45.847 22:29:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:45.847 22:29:56 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:22:45.847 22:29:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:22:45.847 22:29:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:45.847 22:29:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:45.847 ************************************ 00:22:45.847 START TEST raid_superblock_test 00:22:45.847 ************************************ 00:22:45.847 22:29:56 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:22:45.847 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:22:45.847 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:22:45.847 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:45.847 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:45.847 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=3521020 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 3521020 /var/tmp/spdk-raid.sock 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 3521020 ']' 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:46.107 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:46.107 22:29:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:46.107 [2024-07-12 22:29:56.236075] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:22:46.107 [2024-07-12 22:29:56.236148] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3521020 ] 00:22:46.107 [2024-07-12 22:29:56.366241] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:46.367 [2024-07-12 22:29:56.469027] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:46.367 [2024-07-12 22:29:56.529041] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:46.367 [2024-07-12 22:29:56.529070] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:46.936 22:29:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:46.936 22:29:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:22:46.936 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:46.936 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:46.936 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:46.936 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:46.936 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:46.936 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:46.936 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:46.936 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:46.936 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:22:47.196 malloc1 00:22:47.196 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:47.455 [2024-07-12 22:29:57.642211] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:47.455 [2024-07-12 22:29:57.642267] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.455 [2024-07-12 22:29:57.642289] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1010570 00:22:47.455 [2024-07-12 22:29:57.642302] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.455 [2024-07-12 22:29:57.644083] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.455 [2024-07-12 22:29:57.644111] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:47.455 pt1 00:22:47.455 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:47.455 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:47.455 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:47.455 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:47.455 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:47.455 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:47.455 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:47.455 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:47.455 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:22:47.715 malloc2 00:22:47.715 22:29:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:47.974 [2024-07-12 22:29:58.132264] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:47.974 [2024-07-12 22:29:58.132310] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.974 [2024-07-12 22:29:58.132327] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1011970 00:22:47.974 [2024-07-12 22:29:58.132340] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.974 [2024-07-12 22:29:58.133814] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.974 [2024-07-12 22:29:58.133842] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:47.974 pt2 00:22:47.974 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:47.974 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:47.974 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:22:47.974 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:22:47.974 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:22:47.974 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:47.974 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:47.974 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:47.974 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:22:48.266 malloc3 00:22:48.266 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:48.525 [2024-07-12 22:29:58.626150] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:48.525 [2024-07-12 22:29:58.626200] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:48.525 [2024-07-12 22:29:58.626217] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11a8340 00:22:48.525 [2024-07-12 22:29:58.626230] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:48.525 [2024-07-12 22:29:58.627622] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:48.525 [2024-07-12 22:29:58.627649] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:48.525 pt3 00:22:48.525 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:48.525 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:48.525 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:22:48.525 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:22:48.525 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:22:48.525 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:48.525 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:48.525 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:48.526 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:22:48.783 malloc4 00:22:48.784 22:29:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:49.041 [2024-07-12 22:29:59.124267] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:49.041 [2024-07-12 22:29:59.124318] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:49.041 [2024-07-12 22:29:59.124339] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11aac60 00:22:49.041 [2024-07-12 22:29:59.124352] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:49.041 [2024-07-12 22:29:59.125760] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:49.041 [2024-07-12 22:29:59.125788] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:49.041 pt4 00:22:49.041 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:49.041 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:49.041 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:22:49.300 [2024-07-12 22:29:59.368939] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:49.300 [2024-07-12 22:29:59.370115] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:49.300 [2024-07-12 22:29:59.370173] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:49.300 [2024-07-12 22:29:59.370216] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:49.300 [2024-07-12 22:29:59.370380] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1008530 00:22:49.300 [2024-07-12 22:29:59.370391] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:49.300 [2024-07-12 22:29:59.370574] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x1006770 00:22:49.300 [2024-07-12 22:29:59.370723] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1008530 00:22:49.300 [2024-07-12 22:29:59.370733] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1008530 00:22:49.300 [2024-07-12 22:29:59.370825] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:49.300 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:49.300 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:49.300 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:49.300 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:49.300 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:49.300 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:49.300 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:49.300 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:49.300 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:49.300 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:49.300 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.300 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.558 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:49.558 "name": "raid_bdev1", 00:22:49.558 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:22:49.558 "strip_size_kb": 0, 00:22:49.558 "state": "online", 00:22:49.558 "raid_level": "raid1", 00:22:49.558 "superblock": true, 00:22:49.558 "num_base_bdevs": 4, 00:22:49.558 "num_base_bdevs_discovered": 4, 00:22:49.558 "num_base_bdevs_operational": 4, 00:22:49.558 "base_bdevs_list": [ 00:22:49.558 { 00:22:49.558 "name": "pt1", 00:22:49.558 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:49.558 "is_configured": true, 00:22:49.558 "data_offset": 2048, 00:22:49.558 "data_size": 63488 00:22:49.558 }, 00:22:49.558 { 00:22:49.558 "name": "pt2", 00:22:49.558 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:49.558 "is_configured": true, 00:22:49.558 "data_offset": 2048, 00:22:49.558 "data_size": 63488 00:22:49.558 }, 00:22:49.558 { 00:22:49.558 "name": "pt3", 00:22:49.558 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:49.558 "is_configured": true, 00:22:49.558 "data_offset": 2048, 00:22:49.558 "data_size": 63488 00:22:49.558 }, 00:22:49.558 { 00:22:49.558 "name": "pt4", 00:22:49.558 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:49.558 "is_configured": true, 00:22:49.558 "data_offset": 2048, 00:22:49.558 "data_size": 63488 00:22:49.558 } 00:22:49.558 ] 00:22:49.558 }' 00:22:49.558 22:29:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:49.558 22:29:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:50.124 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # 
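
verify_raid_bdev_state, as traced above, pulls the new volume's record out of bdev_raid_get_bdevs and compares a few fields against the expected values (state online, level raid1, four base bdevs discovered and operational). A condensed sketch of that check against the same target; the verify_state helper name is illustrative, not part of the harness:

  set -e
  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

  # Fetch the raid bdev record by name and assert state, level and counts.
  verify_state() {
      local name=$1 state=$2 level=$3 operational=$4 info
      info=$(rpc bdev_raid_get_bdevs all | jq ".[] | select(.name == \"$name\")")
      [[ $(jq -r .state <<< "$info") == "$state" ]]
      [[ $(jq -r .raid_level <<< "$info") == "$level" ]]
      [[ $(jq -r .num_base_bdevs_operational <<< "$info") -eq $operational ]]
  }

  verify_state raid_bdev1 online raid1 4
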
verify_raid_bdev_properties raid_bdev1 00:22:50.124 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:50.124 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:50.124 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:50.124 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:50.124 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:50.124 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:50.124 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:50.124 [2024-07-12 22:30:00.448103] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:50.383 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:50.383 "name": "raid_bdev1", 00:22:50.383 "aliases": [ 00:22:50.383 "16021fd1-39d5-464a-88ee-0d1a07fec421" 00:22:50.383 ], 00:22:50.383 "product_name": "Raid Volume", 00:22:50.383 "block_size": 512, 00:22:50.383 "num_blocks": 63488, 00:22:50.383 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:22:50.383 "assigned_rate_limits": { 00:22:50.383 "rw_ios_per_sec": 0, 00:22:50.383 "rw_mbytes_per_sec": 0, 00:22:50.383 "r_mbytes_per_sec": 0, 00:22:50.383 "w_mbytes_per_sec": 0 00:22:50.383 }, 00:22:50.383 "claimed": false, 00:22:50.383 "zoned": false, 00:22:50.383 "supported_io_types": { 00:22:50.383 "read": true, 00:22:50.383 "write": true, 00:22:50.383 "unmap": false, 00:22:50.383 "flush": false, 00:22:50.383 "reset": true, 00:22:50.383 "nvme_admin": false, 00:22:50.383 "nvme_io": false, 00:22:50.383 "nvme_io_md": false, 00:22:50.383 "write_zeroes": true, 00:22:50.383 "zcopy": false, 00:22:50.383 "get_zone_info": false, 00:22:50.383 "zone_management": false, 00:22:50.383 "zone_append": false, 00:22:50.383 "compare": false, 00:22:50.383 "compare_and_write": false, 00:22:50.383 "abort": false, 00:22:50.383 "seek_hole": false, 00:22:50.383 "seek_data": false, 00:22:50.383 "copy": false, 00:22:50.383 "nvme_iov_md": false 00:22:50.383 }, 00:22:50.383 "memory_domains": [ 00:22:50.383 { 00:22:50.383 "dma_device_id": "system", 00:22:50.383 "dma_device_type": 1 00:22:50.383 }, 00:22:50.383 { 00:22:50.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.383 "dma_device_type": 2 00:22:50.383 }, 00:22:50.383 { 00:22:50.383 "dma_device_id": "system", 00:22:50.383 "dma_device_type": 1 00:22:50.383 }, 00:22:50.383 { 00:22:50.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.383 "dma_device_type": 2 00:22:50.383 }, 00:22:50.383 { 00:22:50.383 "dma_device_id": "system", 00:22:50.383 "dma_device_type": 1 00:22:50.383 }, 00:22:50.383 { 00:22:50.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.383 "dma_device_type": 2 00:22:50.383 }, 00:22:50.383 { 00:22:50.383 "dma_device_id": "system", 00:22:50.383 "dma_device_type": 1 00:22:50.383 }, 00:22:50.383 { 00:22:50.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.383 "dma_device_type": 2 00:22:50.383 } 00:22:50.383 ], 00:22:50.383 "driver_specific": { 00:22:50.383 "raid": { 00:22:50.383 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:22:50.383 "strip_size_kb": 0, 00:22:50.383 "state": "online", 00:22:50.383 "raid_level": "raid1", 00:22:50.383 "superblock": true, 00:22:50.383 
"num_base_bdevs": 4, 00:22:50.383 "num_base_bdevs_discovered": 4, 00:22:50.383 "num_base_bdevs_operational": 4, 00:22:50.383 "base_bdevs_list": [ 00:22:50.383 { 00:22:50.383 "name": "pt1", 00:22:50.383 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:50.383 "is_configured": true, 00:22:50.383 "data_offset": 2048, 00:22:50.383 "data_size": 63488 00:22:50.383 }, 00:22:50.383 { 00:22:50.383 "name": "pt2", 00:22:50.383 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:50.383 "is_configured": true, 00:22:50.383 "data_offset": 2048, 00:22:50.383 "data_size": 63488 00:22:50.383 }, 00:22:50.383 { 00:22:50.383 "name": "pt3", 00:22:50.383 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:50.383 "is_configured": true, 00:22:50.383 "data_offset": 2048, 00:22:50.383 "data_size": 63488 00:22:50.383 }, 00:22:50.383 { 00:22:50.383 "name": "pt4", 00:22:50.383 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:50.383 "is_configured": true, 00:22:50.383 "data_offset": 2048, 00:22:50.383 "data_size": 63488 00:22:50.383 } 00:22:50.383 ] 00:22:50.383 } 00:22:50.383 } 00:22:50.383 }' 00:22:50.383 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:50.383 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:50.383 pt2 00:22:50.383 pt3 00:22:50.383 pt4' 00:22:50.383 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:50.383 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:50.383 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:50.642 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:50.642 "name": "pt1", 00:22:50.642 "aliases": [ 00:22:50.642 "00000000-0000-0000-0000-000000000001" 00:22:50.642 ], 00:22:50.642 "product_name": "passthru", 00:22:50.642 "block_size": 512, 00:22:50.642 "num_blocks": 65536, 00:22:50.642 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:50.642 "assigned_rate_limits": { 00:22:50.642 "rw_ios_per_sec": 0, 00:22:50.642 "rw_mbytes_per_sec": 0, 00:22:50.642 "r_mbytes_per_sec": 0, 00:22:50.642 "w_mbytes_per_sec": 0 00:22:50.642 }, 00:22:50.642 "claimed": true, 00:22:50.642 "claim_type": "exclusive_write", 00:22:50.642 "zoned": false, 00:22:50.642 "supported_io_types": { 00:22:50.642 "read": true, 00:22:50.642 "write": true, 00:22:50.642 "unmap": true, 00:22:50.642 "flush": true, 00:22:50.642 "reset": true, 00:22:50.642 "nvme_admin": false, 00:22:50.642 "nvme_io": false, 00:22:50.642 "nvme_io_md": false, 00:22:50.642 "write_zeroes": true, 00:22:50.642 "zcopy": true, 00:22:50.642 "get_zone_info": false, 00:22:50.642 "zone_management": false, 00:22:50.642 "zone_append": false, 00:22:50.642 "compare": false, 00:22:50.642 "compare_and_write": false, 00:22:50.642 "abort": true, 00:22:50.642 "seek_hole": false, 00:22:50.642 "seek_data": false, 00:22:50.642 "copy": true, 00:22:50.642 "nvme_iov_md": false 00:22:50.642 }, 00:22:50.642 "memory_domains": [ 00:22:50.642 { 00:22:50.642 "dma_device_id": "system", 00:22:50.642 "dma_device_type": 1 00:22:50.642 }, 00:22:50.642 { 00:22:50.642 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.642 "dma_device_type": 2 00:22:50.642 } 00:22:50.642 ], 00:22:50.642 "driver_specific": { 00:22:50.642 "passthru": { 00:22:50.642 
"name": "pt1", 00:22:50.642 "base_bdev_name": "malloc1" 00:22:50.642 } 00:22:50.642 } 00:22:50.642 }' 00:22:50.642 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:50.642 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:50.642 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:50.642 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:50.642 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:50.642 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:50.642 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:50.901 22:30:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:50.901 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:50.901 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:50.901 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:50.901 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:50.901 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:50.901 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:50.901 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:51.160 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:51.160 "name": "pt2", 00:22:51.160 "aliases": [ 00:22:51.160 "00000000-0000-0000-0000-000000000002" 00:22:51.160 ], 00:22:51.160 "product_name": "passthru", 00:22:51.160 "block_size": 512, 00:22:51.160 "num_blocks": 65536, 00:22:51.160 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:51.160 "assigned_rate_limits": { 00:22:51.160 "rw_ios_per_sec": 0, 00:22:51.160 "rw_mbytes_per_sec": 0, 00:22:51.160 "r_mbytes_per_sec": 0, 00:22:51.160 "w_mbytes_per_sec": 0 00:22:51.160 }, 00:22:51.160 "claimed": true, 00:22:51.160 "claim_type": "exclusive_write", 00:22:51.160 "zoned": false, 00:22:51.160 "supported_io_types": { 00:22:51.160 "read": true, 00:22:51.160 "write": true, 00:22:51.160 "unmap": true, 00:22:51.160 "flush": true, 00:22:51.160 "reset": true, 00:22:51.160 "nvme_admin": false, 00:22:51.160 "nvme_io": false, 00:22:51.160 "nvme_io_md": false, 00:22:51.160 "write_zeroes": true, 00:22:51.160 "zcopy": true, 00:22:51.160 "get_zone_info": false, 00:22:51.160 "zone_management": false, 00:22:51.160 "zone_append": false, 00:22:51.160 "compare": false, 00:22:51.160 "compare_and_write": false, 00:22:51.160 "abort": true, 00:22:51.160 "seek_hole": false, 00:22:51.160 "seek_data": false, 00:22:51.160 "copy": true, 00:22:51.160 "nvme_iov_md": false 00:22:51.160 }, 00:22:51.160 "memory_domains": [ 00:22:51.160 { 00:22:51.160 "dma_device_id": "system", 00:22:51.161 "dma_device_type": 1 00:22:51.161 }, 00:22:51.161 { 00:22:51.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:51.161 "dma_device_type": 2 00:22:51.161 } 00:22:51.161 ], 00:22:51.161 "driver_specific": { 00:22:51.161 "passthru": { 00:22:51.161 "name": "pt2", 00:22:51.161 "base_bdev_name": "malloc2" 00:22:51.161 } 00:22:51.161 } 00:22:51.161 }' 00:22:51.161 22:30:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:51.161 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:51.161 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:51.161 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:51.161 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:51.420 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:51.420 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:51.420 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:51.420 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:51.420 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:51.420 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:51.420 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:51.420 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:51.420 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:51.420 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:51.679 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:51.679 "name": "pt3", 00:22:51.679 "aliases": [ 00:22:51.679 "00000000-0000-0000-0000-000000000003" 00:22:51.679 ], 00:22:51.679 "product_name": "passthru", 00:22:51.679 "block_size": 512, 00:22:51.679 "num_blocks": 65536, 00:22:51.679 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:51.679 "assigned_rate_limits": { 00:22:51.679 "rw_ios_per_sec": 0, 00:22:51.679 "rw_mbytes_per_sec": 0, 00:22:51.679 "r_mbytes_per_sec": 0, 00:22:51.679 "w_mbytes_per_sec": 0 00:22:51.679 }, 00:22:51.679 "claimed": true, 00:22:51.679 "claim_type": "exclusive_write", 00:22:51.679 "zoned": false, 00:22:51.679 "supported_io_types": { 00:22:51.679 "read": true, 00:22:51.679 "write": true, 00:22:51.679 "unmap": true, 00:22:51.679 "flush": true, 00:22:51.679 "reset": true, 00:22:51.679 "nvme_admin": false, 00:22:51.679 "nvme_io": false, 00:22:51.679 "nvme_io_md": false, 00:22:51.679 "write_zeroes": true, 00:22:51.679 "zcopy": true, 00:22:51.679 "get_zone_info": false, 00:22:51.679 "zone_management": false, 00:22:51.679 "zone_append": false, 00:22:51.679 "compare": false, 00:22:51.679 "compare_and_write": false, 00:22:51.679 "abort": true, 00:22:51.679 "seek_hole": false, 00:22:51.679 "seek_data": false, 00:22:51.679 "copy": true, 00:22:51.679 "nvme_iov_md": false 00:22:51.679 }, 00:22:51.679 "memory_domains": [ 00:22:51.679 { 00:22:51.679 "dma_device_id": "system", 00:22:51.679 "dma_device_type": 1 00:22:51.679 }, 00:22:51.680 { 00:22:51.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:51.680 "dma_device_type": 2 00:22:51.680 } 00:22:51.680 ], 00:22:51.680 "driver_specific": { 00:22:51.680 "passthru": { 00:22:51.680 "name": "pt3", 00:22:51.680 "base_bdev_name": "malloc3" 00:22:51.680 } 00:22:51.680 } 00:22:51.680 }' 00:22:51.680 22:30:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:51.680 22:30:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:51.939 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:51.940 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:51.940 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:51.940 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:51.940 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:51.940 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:51.940 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:51.940 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:52.199 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:52.199 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:52.199 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:52.199 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:52.199 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:52.458 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:52.458 "name": "pt4", 00:22:52.458 "aliases": [ 00:22:52.458 "00000000-0000-0000-0000-000000000004" 00:22:52.458 ], 00:22:52.458 "product_name": "passthru", 00:22:52.458 "block_size": 512, 00:22:52.458 "num_blocks": 65536, 00:22:52.458 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:52.458 "assigned_rate_limits": { 00:22:52.458 "rw_ios_per_sec": 0, 00:22:52.458 "rw_mbytes_per_sec": 0, 00:22:52.458 "r_mbytes_per_sec": 0, 00:22:52.458 "w_mbytes_per_sec": 0 00:22:52.458 }, 00:22:52.458 "claimed": true, 00:22:52.458 "claim_type": "exclusive_write", 00:22:52.458 "zoned": false, 00:22:52.458 "supported_io_types": { 00:22:52.458 "read": true, 00:22:52.458 "write": true, 00:22:52.458 "unmap": true, 00:22:52.458 "flush": true, 00:22:52.458 "reset": true, 00:22:52.458 "nvme_admin": false, 00:22:52.458 "nvme_io": false, 00:22:52.458 "nvme_io_md": false, 00:22:52.458 "write_zeroes": true, 00:22:52.458 "zcopy": true, 00:22:52.458 "get_zone_info": false, 00:22:52.458 "zone_management": false, 00:22:52.458 "zone_append": false, 00:22:52.458 "compare": false, 00:22:52.458 "compare_and_write": false, 00:22:52.458 "abort": true, 00:22:52.458 "seek_hole": false, 00:22:52.458 "seek_data": false, 00:22:52.458 "copy": true, 00:22:52.458 "nvme_iov_md": false 00:22:52.458 }, 00:22:52.458 "memory_domains": [ 00:22:52.458 { 00:22:52.458 "dma_device_id": "system", 00:22:52.458 "dma_device_type": 1 00:22:52.458 }, 00:22:52.458 { 00:22:52.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:52.458 "dma_device_type": 2 00:22:52.458 } 00:22:52.458 ], 00:22:52.458 "driver_specific": { 00:22:52.458 "passthru": { 00:22:52.458 "name": "pt4", 00:22:52.458 "base_bdev_name": "malloc4" 00:22:52.458 } 00:22:52.458 } 00:22:52.459 }' 00:22:52.459 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:52.459 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:52.459 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 
512 ]] 00:22:52.459 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:52.459 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:52.459 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:52.459 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:52.724 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:52.724 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:52.724 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:52.724 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:52.724 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:52.724 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:52.724 22:30:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:52.982 [2024-07-12 22:30:03.147258] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:52.982 22:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=16021fd1-39d5-464a-88ee-0d1a07fec421 00:22:52.982 22:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 16021fd1-39d5-464a-88ee-0d1a07fec421 ']' 00:22:52.982 22:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:53.241 [2024-07-12 22:30:03.379539] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:53.241 [2024-07-12 22:30:03.379568] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:53.241 [2024-07-12 22:30:03.379630] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:53.241 [2024-07-12 22:30:03.379718] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:53.241 [2024-07-12 22:30:03.379731] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1008530 name raid_bdev1, state offline 00:22:53.241 22:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.241 22:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:53.501 22:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:53.501 22:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:53.501 22:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:53.501 22:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:53.760 22:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:53.760 22:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:22:53.760 22:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:54.020 22:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:54.020 22:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:54.020 22:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:54.280 22:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:54.280 22:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:54.541 22:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:54.800 [2024-07-12 22:30:05.055905] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:54.800 [2024-07-12 22:30:05.057302] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:54.800 [2024-07-12 22:30:05.057348] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:22:54.800 [2024-07-12 22:30:05.057382] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:22:54.800 [2024-07-12 22:30:05.057430] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:54.800 [2024-07-12 22:30:05.057470] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:54.800 [2024-07-12 22:30:05.057494] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:22:54.800 [2024-07-12 22:30:05.057523] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:22:54.800 [2024-07-12 22:30:05.057541] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:54.800 [2024-07-12 22:30:05.057552] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11b3ff0 name raid_bdev1, state configuring 00:22:54.800 request: 00:22:54.800 { 00:22:54.800 "name": "raid_bdev1", 00:22:54.800 "raid_level": "raid1", 00:22:54.800 "base_bdevs": [ 00:22:54.800 "malloc1", 00:22:54.800 "malloc2", 00:22:54.800 "malloc3", 00:22:54.800 "malloc4" 00:22:54.800 ], 00:22:54.800 "superblock": false, 00:22:54.800 "method": "bdev_raid_create", 00:22:54.800 "req_id": 1 00:22:54.800 } 00:22:54.800 Got JSON-RPC error response 00:22:54.800 response: 00:22:54.800 { 00:22:54.800 "code": -17, 00:22:54.800 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:54.800 } 00:22:54.800 22:30:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:22:54.800 22:30:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:54.800 22:30:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:54.801 22:30:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:54.801 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.801 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:55.060 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:55.060 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:55.060 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:55.319 [2024-07-12 22:30:05.553164] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:55.319 [2024-07-12 22:30:05.553218] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:55.319 [2024-07-12 22:30:05.553242] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10107a0 00:22:55.319 [2024-07-12 22:30:05.553263] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:55.319 [2024-07-12 22:30:05.554877] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:55.319 [2024-07-12 22:30:05.554906] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:55.319 [2024-07-12 22:30:05.554995] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
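
The error response above is the expected negative result. Earlier in the trace the volume and its passthru bdevs were deleted (bdev_raid_delete raid_bdev1, then bdev_passthru_delete pt1..pt4) while the malloc bdevs, and the superblocks written onto them, remain; creating a new raid directly on the malloc bdevs is therefore rejected with -17 "File exists" because the examine path finds the old array's superblock on every one of them. A minimal sketch of that teardown and negative check, using a plain if in place of the harness's NOT wrapper:

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

  # Tear down the array but keep the malloc bdevs (and their superblocks).
  rpc bdev_raid_delete raid_bdev1
  for i in 1 2 3 4; do
      rpc bdev_passthru_delete "pt$i"
  done

  # Creating a raid straight on the malloc bdevs must fail: each still carries
  # the superblock of the deleted raid_bdev1.
  if rpc bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1; then
      echo "bdev_raid_create unexpectedly succeeded" >&2
      exit 1
  fi
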
superblock found on bdev pt1 00:22:55.319 [2024-07-12 22:30:05.555024] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:55.319 pt1 00:22:55.319 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:22:55.320 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:55.320 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:55.320 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:55.320 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:55.320 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:55.320 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:55.320 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:55.320 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:55.320 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:55.320 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.320 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:55.579 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:55.579 "name": "raid_bdev1", 00:22:55.579 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:22:55.579 "strip_size_kb": 0, 00:22:55.579 "state": "configuring", 00:22:55.579 "raid_level": "raid1", 00:22:55.579 "superblock": true, 00:22:55.579 "num_base_bdevs": 4, 00:22:55.579 "num_base_bdevs_discovered": 1, 00:22:55.579 "num_base_bdevs_operational": 4, 00:22:55.579 "base_bdevs_list": [ 00:22:55.579 { 00:22:55.579 "name": "pt1", 00:22:55.579 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:55.579 "is_configured": true, 00:22:55.579 "data_offset": 2048, 00:22:55.579 "data_size": 63488 00:22:55.579 }, 00:22:55.579 { 00:22:55.579 "name": null, 00:22:55.579 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:55.579 "is_configured": false, 00:22:55.579 "data_offset": 2048, 00:22:55.579 "data_size": 63488 00:22:55.579 }, 00:22:55.579 { 00:22:55.579 "name": null, 00:22:55.579 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:55.579 "is_configured": false, 00:22:55.579 "data_offset": 2048, 00:22:55.579 "data_size": 63488 00:22:55.579 }, 00:22:55.579 { 00:22:55.579 "name": null, 00:22:55.579 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:55.579 "is_configured": false, 00:22:55.579 "data_offset": 2048, 00:22:55.579 "data_size": 63488 00:22:55.579 } 00:22:55.579 ] 00:22:55.579 }' 00:22:55.579 22:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:55.579 22:30:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:56.147 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:22:56.147 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:22:56.407 [2024-07-12 22:30:06.499702] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:56.407 [2024-07-12 22:30:06.499756] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:56.407 [2024-07-12 22:30:06.499777] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11a9940 00:22:56.407 [2024-07-12 22:30:06.499789] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:56.407 [2024-07-12 22:30:06.500144] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:56.407 [2024-07-12 22:30:06.500162] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:56.407 [2024-07-12 22:30:06.500224] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:56.407 [2024-07-12 22:30:06.500242] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:56.407 pt2 00:22:56.407 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:56.667 [2024-07-12 22:30:06.744354] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:22:56.667 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:22:56.667 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:56.667 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:56.667 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:56.667 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:56.667 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:56.667 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:56.667 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:56.667 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:56.667 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:56.667 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.667 22:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.927 22:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.927 "name": "raid_bdev1", 00:22:56.927 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:22:56.927 "strip_size_kb": 0, 00:22:56.927 "state": "configuring", 00:22:56.927 "raid_level": "raid1", 00:22:56.927 "superblock": true, 00:22:56.927 "num_base_bdevs": 4, 00:22:56.927 "num_base_bdevs_discovered": 1, 00:22:56.927 "num_base_bdevs_operational": 4, 00:22:56.927 "base_bdevs_list": [ 00:22:56.927 { 00:22:56.927 "name": "pt1", 00:22:56.927 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:56.927 "is_configured": true, 00:22:56.927 "data_offset": 2048, 00:22:56.927 "data_size": 63488 00:22:56.927 }, 00:22:56.927 { 00:22:56.927 "name": null, 00:22:56.927 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:22:56.927 "is_configured": false, 00:22:56.927 "data_offset": 2048, 00:22:56.927 "data_size": 63488 00:22:56.927 }, 00:22:56.927 { 00:22:56.927 "name": null, 00:22:56.927 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:56.927 "is_configured": false, 00:22:56.927 "data_offset": 2048, 00:22:56.927 "data_size": 63488 00:22:56.927 }, 00:22:56.927 { 00:22:56.927 "name": null, 00:22:56.927 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:56.927 "is_configured": false, 00:22:56.927 "data_offset": 2048, 00:22:56.927 "data_size": 63488 00:22:56.927 } 00:22:56.927 ] 00:22:56.927 }' 00:22:56.927 22:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.927 22:30:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:57.495 22:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:57.495 22:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:57.495 22:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:57.754 [2024-07-12 22:30:07.827228] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:57.754 [2024-07-12 22:30:07.827279] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:57.754 [2024-07-12 22:30:07.827298] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1007060 00:22:57.754 [2024-07-12 22:30:07.827310] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:57.754 [2024-07-12 22:30:07.827656] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:57.754 [2024-07-12 22:30:07.827673] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:57.754 [2024-07-12 22:30:07.827734] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:57.754 [2024-07-12 22:30:07.827753] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:57.754 pt2 00:22:57.754 22:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:57.754 22:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:57.754 22:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:57.754 [2024-07-12 22:30:08.071869] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:57.754 [2024-07-12 22:30:08.071909] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:57.754 [2024-07-12 22:30:08.071937] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10098d0 00:22:57.754 [2024-07-12 22:30:08.071949] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:57.754 [2024-07-12 22:30:08.072236] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:57.754 [2024-07-12 22:30:08.072253] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:57.754 [2024-07-12 22:30:08.072305] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on 
bdev pt3 00:22:57.754 [2024-07-12 22:30:08.072322] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:57.754 pt3 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:58.013 [2024-07-12 22:30:08.252354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:58.013 [2024-07-12 22:30:08.252388] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:58.013 [2024-07-12 22:30:08.252405] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x100ab80 00:22:58.013 [2024-07-12 22:30:08.252418] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:58.013 [2024-07-12 22:30:08.252710] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:58.013 [2024-07-12 22:30:08.252728] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:58.013 [2024-07-12 22:30:08.252781] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:58.013 [2024-07-12 22:30:08.252799] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:58.013 [2024-07-12 22:30:08.252917] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1007780 00:22:58.013 [2024-07-12 22:30:08.252937] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:58.013 [2024-07-12 22:30:08.253113] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x100cfa0 00:22:58.013 [2024-07-12 22:30:08.253248] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1007780 00:22:58.013 [2024-07-12 22:30:08.253258] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1007780 00:22:58.013 [2024-07-12 22:30:08.253354] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:58.013 pt4 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
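
The preceding entries re-create pt1 through pt4 on top of the same malloc bdevs. No bdev_raid_create is issued this time: as each passthru bdev registers, bdev_raid's examine path finds the superblock written earlier, claims the bdev, and raid_bdev1 moves from "configuring" (one of four base bdevs discovered) back to "online" once all four are present; deleting pt2 part-way through simply drops it back out of the configuring array until it is re-created. The remainder of the trace verifies the re-assembled volume the same way as before. A minimal sketch of the re-assembly against the same target:

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

  # Re-create the passthru bdevs; the superblocks on malloc1..malloc4 are
  # enough for raid_bdev1 to be re-assembled without an explicit create call.
  for i in 1 2 3 4; do
      rpc bdev_passthru_create -b "malloc$i" -p "pt$i" \
          -u "00000000-0000-0000-0000-00000000000$i"
  done

  # With all four base bdevs back, the array should report online again.
  rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'
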
00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.013 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.272 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:58.272 "name": "raid_bdev1", 00:22:58.272 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:22:58.272 "strip_size_kb": 0, 00:22:58.272 "state": "online", 00:22:58.272 "raid_level": "raid1", 00:22:58.272 "superblock": true, 00:22:58.272 "num_base_bdevs": 4, 00:22:58.272 "num_base_bdevs_discovered": 4, 00:22:58.272 "num_base_bdevs_operational": 4, 00:22:58.272 "base_bdevs_list": [ 00:22:58.272 { 00:22:58.272 "name": "pt1", 00:22:58.272 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:58.272 "is_configured": true, 00:22:58.272 "data_offset": 2048, 00:22:58.272 "data_size": 63488 00:22:58.272 }, 00:22:58.272 { 00:22:58.272 "name": "pt2", 00:22:58.272 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:58.272 "is_configured": true, 00:22:58.272 "data_offset": 2048, 00:22:58.272 "data_size": 63488 00:22:58.272 }, 00:22:58.272 { 00:22:58.272 "name": "pt3", 00:22:58.272 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:58.272 "is_configured": true, 00:22:58.272 "data_offset": 2048, 00:22:58.272 "data_size": 63488 00:22:58.272 }, 00:22:58.272 { 00:22:58.272 "name": "pt4", 00:22:58.272 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:58.272 "is_configured": true, 00:22:58.272 "data_offset": 2048, 00:22:58.272 "data_size": 63488 00:22:58.272 } 00:22:58.272 ] 00:22:58.272 }' 00:22:58.272 22:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:58.272 22:30:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:58.840 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:58.840 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:58.840 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:58.840 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:58.840 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:58.840 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:58.840 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:58.840 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:59.100 [2024-07-12 22:30:09.351587] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:59.100 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:59.100 "name": "raid_bdev1", 00:22:59.100 "aliases": [ 00:22:59.100 "16021fd1-39d5-464a-88ee-0d1a07fec421" 00:22:59.100 ], 00:22:59.100 "product_name": "Raid Volume", 00:22:59.100 "block_size": 512, 00:22:59.100 "num_blocks": 63488, 00:22:59.100 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:22:59.100 "assigned_rate_limits": { 00:22:59.100 "rw_ios_per_sec": 0, 
00:22:59.100 "rw_mbytes_per_sec": 0, 00:22:59.100 "r_mbytes_per_sec": 0, 00:22:59.100 "w_mbytes_per_sec": 0 00:22:59.100 }, 00:22:59.100 "claimed": false, 00:22:59.100 "zoned": false, 00:22:59.100 "supported_io_types": { 00:22:59.100 "read": true, 00:22:59.100 "write": true, 00:22:59.100 "unmap": false, 00:22:59.100 "flush": false, 00:22:59.100 "reset": true, 00:22:59.100 "nvme_admin": false, 00:22:59.100 "nvme_io": false, 00:22:59.100 "nvme_io_md": false, 00:22:59.100 "write_zeroes": true, 00:22:59.100 "zcopy": false, 00:22:59.100 "get_zone_info": false, 00:22:59.100 "zone_management": false, 00:22:59.100 "zone_append": false, 00:22:59.100 "compare": false, 00:22:59.100 "compare_and_write": false, 00:22:59.100 "abort": false, 00:22:59.100 "seek_hole": false, 00:22:59.100 "seek_data": false, 00:22:59.100 "copy": false, 00:22:59.100 "nvme_iov_md": false 00:22:59.100 }, 00:22:59.100 "memory_domains": [ 00:22:59.100 { 00:22:59.100 "dma_device_id": "system", 00:22:59.100 "dma_device_type": 1 00:22:59.100 }, 00:22:59.100 { 00:22:59.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:59.100 "dma_device_type": 2 00:22:59.100 }, 00:22:59.100 { 00:22:59.100 "dma_device_id": "system", 00:22:59.100 "dma_device_type": 1 00:22:59.100 }, 00:22:59.100 { 00:22:59.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:59.100 "dma_device_type": 2 00:22:59.100 }, 00:22:59.100 { 00:22:59.100 "dma_device_id": "system", 00:22:59.100 "dma_device_type": 1 00:22:59.100 }, 00:22:59.100 { 00:22:59.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:59.100 "dma_device_type": 2 00:22:59.100 }, 00:22:59.100 { 00:22:59.100 "dma_device_id": "system", 00:22:59.100 "dma_device_type": 1 00:22:59.100 }, 00:22:59.100 { 00:22:59.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:59.100 "dma_device_type": 2 00:22:59.100 } 00:22:59.100 ], 00:22:59.100 "driver_specific": { 00:22:59.100 "raid": { 00:22:59.100 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:22:59.100 "strip_size_kb": 0, 00:22:59.100 "state": "online", 00:22:59.100 "raid_level": "raid1", 00:22:59.100 "superblock": true, 00:22:59.100 "num_base_bdevs": 4, 00:22:59.100 "num_base_bdevs_discovered": 4, 00:22:59.100 "num_base_bdevs_operational": 4, 00:22:59.100 "base_bdevs_list": [ 00:22:59.100 { 00:22:59.100 "name": "pt1", 00:22:59.100 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:59.100 "is_configured": true, 00:22:59.100 "data_offset": 2048, 00:22:59.100 "data_size": 63488 00:22:59.100 }, 00:22:59.100 { 00:22:59.100 "name": "pt2", 00:22:59.100 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:59.100 "is_configured": true, 00:22:59.100 "data_offset": 2048, 00:22:59.100 "data_size": 63488 00:22:59.100 }, 00:22:59.100 { 00:22:59.100 "name": "pt3", 00:22:59.100 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:59.100 "is_configured": true, 00:22:59.100 "data_offset": 2048, 00:22:59.100 "data_size": 63488 00:22:59.100 }, 00:22:59.100 { 00:22:59.100 "name": "pt4", 00:22:59.100 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:59.100 "is_configured": true, 00:22:59.100 "data_offset": 2048, 00:22:59.100 "data_size": 63488 00:22:59.100 } 00:22:59.100 ] 00:22:59.100 } 00:22:59.100 } 00:22:59.100 }' 00:22:59.100 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:59.100 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:59.100 pt2 00:22:59.100 pt3 00:22:59.100 pt4' 00:22:59.100 22:30:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:59.100 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:59.101 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:59.360 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:59.360 "name": "pt1", 00:22:59.360 "aliases": [ 00:22:59.360 "00000000-0000-0000-0000-000000000001" 00:22:59.360 ], 00:22:59.360 "product_name": "passthru", 00:22:59.360 "block_size": 512, 00:22:59.360 "num_blocks": 65536, 00:22:59.360 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:59.360 "assigned_rate_limits": { 00:22:59.360 "rw_ios_per_sec": 0, 00:22:59.360 "rw_mbytes_per_sec": 0, 00:22:59.360 "r_mbytes_per_sec": 0, 00:22:59.360 "w_mbytes_per_sec": 0 00:22:59.360 }, 00:22:59.360 "claimed": true, 00:22:59.360 "claim_type": "exclusive_write", 00:22:59.360 "zoned": false, 00:22:59.360 "supported_io_types": { 00:22:59.360 "read": true, 00:22:59.360 "write": true, 00:22:59.360 "unmap": true, 00:22:59.360 "flush": true, 00:22:59.360 "reset": true, 00:22:59.360 "nvme_admin": false, 00:22:59.360 "nvme_io": false, 00:22:59.360 "nvme_io_md": false, 00:22:59.360 "write_zeroes": true, 00:22:59.360 "zcopy": true, 00:22:59.360 "get_zone_info": false, 00:22:59.360 "zone_management": false, 00:22:59.360 "zone_append": false, 00:22:59.360 "compare": false, 00:22:59.360 "compare_and_write": false, 00:22:59.360 "abort": true, 00:22:59.360 "seek_hole": false, 00:22:59.360 "seek_data": false, 00:22:59.360 "copy": true, 00:22:59.360 "nvme_iov_md": false 00:22:59.360 }, 00:22:59.360 "memory_domains": [ 00:22:59.360 { 00:22:59.360 "dma_device_id": "system", 00:22:59.360 "dma_device_type": 1 00:22:59.360 }, 00:22:59.360 { 00:22:59.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:59.360 "dma_device_type": 2 00:22:59.360 } 00:22:59.360 ], 00:22:59.360 "driver_specific": { 00:22:59.360 "passthru": { 00:22:59.360 "name": "pt1", 00:22:59.360 "base_bdev_name": "malloc1" 00:22:59.360 } 00:22:59.360 } 00:22:59.360 }' 00:22:59.360 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:59.619 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:59.619 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:59.619 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:59.619 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:59.619 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:59.619 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:59.619 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:59.878 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:59.878 22:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:59.878 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:59.878 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:59.878 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:59.878 22:30:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:59.878 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:00.136 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:00.136 "name": "pt2", 00:23:00.136 "aliases": [ 00:23:00.136 "00000000-0000-0000-0000-000000000002" 00:23:00.136 ], 00:23:00.136 "product_name": "passthru", 00:23:00.136 "block_size": 512, 00:23:00.136 "num_blocks": 65536, 00:23:00.136 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:00.136 "assigned_rate_limits": { 00:23:00.136 "rw_ios_per_sec": 0, 00:23:00.136 "rw_mbytes_per_sec": 0, 00:23:00.136 "r_mbytes_per_sec": 0, 00:23:00.136 "w_mbytes_per_sec": 0 00:23:00.136 }, 00:23:00.136 "claimed": true, 00:23:00.136 "claim_type": "exclusive_write", 00:23:00.136 "zoned": false, 00:23:00.136 "supported_io_types": { 00:23:00.136 "read": true, 00:23:00.136 "write": true, 00:23:00.136 "unmap": true, 00:23:00.136 "flush": true, 00:23:00.136 "reset": true, 00:23:00.136 "nvme_admin": false, 00:23:00.136 "nvme_io": false, 00:23:00.136 "nvme_io_md": false, 00:23:00.136 "write_zeroes": true, 00:23:00.136 "zcopy": true, 00:23:00.136 "get_zone_info": false, 00:23:00.136 "zone_management": false, 00:23:00.136 "zone_append": false, 00:23:00.136 "compare": false, 00:23:00.136 "compare_and_write": false, 00:23:00.136 "abort": true, 00:23:00.136 "seek_hole": false, 00:23:00.136 "seek_data": false, 00:23:00.136 "copy": true, 00:23:00.136 "nvme_iov_md": false 00:23:00.136 }, 00:23:00.136 "memory_domains": [ 00:23:00.136 { 00:23:00.136 "dma_device_id": "system", 00:23:00.136 "dma_device_type": 1 00:23:00.136 }, 00:23:00.136 { 00:23:00.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.136 "dma_device_type": 2 00:23:00.136 } 00:23:00.136 ], 00:23:00.136 "driver_specific": { 00:23:00.136 "passthru": { 00:23:00.136 "name": "pt2", 00:23:00.136 "base_bdev_name": "malloc2" 00:23:00.136 } 00:23:00.136 } 00:23:00.136 }' 00:23:00.136 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:00.136 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:00.136 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:00.136 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:00.136 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:00.395 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:00.395 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:00.395 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:00.395 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:00.395 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:00.395 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:00.395 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:00.395 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:00.395 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:23:00.395 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:00.654 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:00.654 "name": "pt3", 00:23:00.654 "aliases": [ 00:23:00.654 "00000000-0000-0000-0000-000000000003" 00:23:00.654 ], 00:23:00.654 "product_name": "passthru", 00:23:00.654 "block_size": 512, 00:23:00.654 "num_blocks": 65536, 00:23:00.654 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:00.654 "assigned_rate_limits": { 00:23:00.654 "rw_ios_per_sec": 0, 00:23:00.654 "rw_mbytes_per_sec": 0, 00:23:00.654 "r_mbytes_per_sec": 0, 00:23:00.654 "w_mbytes_per_sec": 0 00:23:00.654 }, 00:23:00.654 "claimed": true, 00:23:00.654 "claim_type": "exclusive_write", 00:23:00.654 "zoned": false, 00:23:00.654 "supported_io_types": { 00:23:00.655 "read": true, 00:23:00.655 "write": true, 00:23:00.655 "unmap": true, 00:23:00.655 "flush": true, 00:23:00.655 "reset": true, 00:23:00.655 "nvme_admin": false, 00:23:00.655 "nvme_io": false, 00:23:00.655 "nvme_io_md": false, 00:23:00.655 "write_zeroes": true, 00:23:00.655 "zcopy": true, 00:23:00.655 "get_zone_info": false, 00:23:00.655 "zone_management": false, 00:23:00.655 "zone_append": false, 00:23:00.655 "compare": false, 00:23:00.655 "compare_and_write": false, 00:23:00.655 "abort": true, 00:23:00.655 "seek_hole": false, 00:23:00.655 "seek_data": false, 00:23:00.655 "copy": true, 00:23:00.655 "nvme_iov_md": false 00:23:00.655 }, 00:23:00.655 "memory_domains": [ 00:23:00.655 { 00:23:00.655 "dma_device_id": "system", 00:23:00.655 "dma_device_type": 1 00:23:00.655 }, 00:23:00.655 { 00:23:00.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.655 "dma_device_type": 2 00:23:00.655 } 00:23:00.655 ], 00:23:00.655 "driver_specific": { 00:23:00.655 "passthru": { 00:23:00.655 "name": "pt3", 00:23:00.655 "base_bdev_name": "malloc3" 00:23:00.655 } 00:23:00.655 } 00:23:00.655 }' 00:23:00.655 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:00.655 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:00.914 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:00.914 22:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:00.914 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:00.914 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:00.914 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:00.914 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:00.914 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:00.914 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:00.914 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:01.173 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:01.173 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:01.173 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:23:01.173 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq 
'.[]' 00:23:01.173 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:01.173 "name": "pt4", 00:23:01.173 "aliases": [ 00:23:01.173 "00000000-0000-0000-0000-000000000004" 00:23:01.173 ], 00:23:01.173 "product_name": "passthru", 00:23:01.173 "block_size": 512, 00:23:01.173 "num_blocks": 65536, 00:23:01.173 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:01.173 "assigned_rate_limits": { 00:23:01.173 "rw_ios_per_sec": 0, 00:23:01.173 "rw_mbytes_per_sec": 0, 00:23:01.173 "r_mbytes_per_sec": 0, 00:23:01.173 "w_mbytes_per_sec": 0 00:23:01.173 }, 00:23:01.173 "claimed": true, 00:23:01.173 "claim_type": "exclusive_write", 00:23:01.173 "zoned": false, 00:23:01.173 "supported_io_types": { 00:23:01.173 "read": true, 00:23:01.173 "write": true, 00:23:01.173 "unmap": true, 00:23:01.173 "flush": true, 00:23:01.173 "reset": true, 00:23:01.173 "nvme_admin": false, 00:23:01.173 "nvme_io": false, 00:23:01.173 "nvme_io_md": false, 00:23:01.173 "write_zeroes": true, 00:23:01.173 "zcopy": true, 00:23:01.173 "get_zone_info": false, 00:23:01.173 "zone_management": false, 00:23:01.173 "zone_append": false, 00:23:01.173 "compare": false, 00:23:01.173 "compare_and_write": false, 00:23:01.173 "abort": true, 00:23:01.173 "seek_hole": false, 00:23:01.173 "seek_data": false, 00:23:01.173 "copy": true, 00:23:01.173 "nvme_iov_md": false 00:23:01.173 }, 00:23:01.173 "memory_domains": [ 00:23:01.173 { 00:23:01.173 "dma_device_id": "system", 00:23:01.173 "dma_device_type": 1 00:23:01.173 }, 00:23:01.173 { 00:23:01.173 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:01.173 "dma_device_type": 2 00:23:01.173 } 00:23:01.173 ], 00:23:01.173 "driver_specific": { 00:23:01.173 "passthru": { 00:23:01.173 "name": "pt4", 00:23:01.173 "base_bdev_name": "malloc4" 00:23:01.173 } 00:23:01.173 } 00:23:01.173 }' 00:23:01.173 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:01.173 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:01.434 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:01.434 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:01.434 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:01.434 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:01.434 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:01.434 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:01.434 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:01.434 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:01.693 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:01.693 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:01.693 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:01.693 22:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:23:01.693 [2024-07-12 22:30:12.018659] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:01.952 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 
16021fd1-39d5-464a-88ee-0d1a07fec421 '!=' 16021fd1-39d5-464a-88ee-0d1a07fec421 ']' 00:23:01.952 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:23:01.952 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:01.952 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:01.952 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:01.952 [2024-07-12 22:30:12.267036] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:02.211 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:02.211 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:02.211 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:02.211 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:02.211 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:02.211 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:02.211 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:02.212 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:02.212 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:02.212 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:02.212 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.212 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.471 22:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:02.471 "name": "raid_bdev1", 00:23:02.471 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:23:02.471 "strip_size_kb": 0, 00:23:02.471 "state": "online", 00:23:02.471 "raid_level": "raid1", 00:23:02.471 "superblock": true, 00:23:02.471 "num_base_bdevs": 4, 00:23:02.471 "num_base_bdevs_discovered": 3, 00:23:02.471 "num_base_bdevs_operational": 3, 00:23:02.471 "base_bdevs_list": [ 00:23:02.471 { 00:23:02.471 "name": null, 00:23:02.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:02.471 "is_configured": false, 00:23:02.471 "data_offset": 2048, 00:23:02.471 "data_size": 63488 00:23:02.471 }, 00:23:02.471 { 00:23:02.471 "name": "pt2", 00:23:02.471 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:02.471 "is_configured": true, 00:23:02.471 "data_offset": 2048, 00:23:02.471 "data_size": 63488 00:23:02.471 }, 00:23:02.471 { 00:23:02.471 "name": "pt3", 00:23:02.471 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:02.471 "is_configured": true, 00:23:02.471 "data_offset": 2048, 00:23:02.471 "data_size": 63488 00:23:02.471 }, 00:23:02.471 { 00:23:02.471 "name": "pt4", 00:23:02.471 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:02.471 "is_configured": true, 00:23:02.471 "data_offset": 2048, 00:23:02.471 "data_size": 63488 00:23:02.471 } 00:23:02.471 ] 00:23:02.471 }' 00:23:02.471 22:30:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:02.471 22:30:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:03.039 22:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:03.039 [2024-07-12 22:30:13.345874] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:03.039 [2024-07-12 22:30:13.345903] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:03.039 [2024-07-12 22:30:13.345968] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:03.039 [2024-07-12 22:30:13.346035] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:03.039 [2024-07-12 22:30:13.346047] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1007780 name raid_bdev1, state offline 00:23:03.298 22:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.298 22:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:23:03.298 22:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:23:03.298 22:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:23:03.298 22:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:23:03.298 22:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:03.298 22:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:03.557 22:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:03.557 22:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:03.557 22:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:23:03.816 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:03.816 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:03.816 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:04.076 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:04.076 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:04.076 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:23:04.076 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:04.076 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:04.337 [2024-07-12 22:30:14.549006] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:04.337 [2024-07-12 22:30:14.549063] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:04.337 [2024-07-12 22:30:14.549085] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11aa700 00:23:04.337 [2024-07-12 22:30:14.549099] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:04.337 [2024-07-12 22:30:14.550785] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:04.337 [2024-07-12 22:30:14.550817] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:04.337 [2024-07-12 22:30:14.550895] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:04.337 [2024-07-12 22:30:14.550936] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:04.337 pt2 00:23:04.337 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:04.337 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:04.337 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:04.337 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:04.338 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:04.338 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:04.338 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.338 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:04.338 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:04.338 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.338 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.338 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.667 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:04.667 "name": "raid_bdev1", 00:23:04.667 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:23:04.667 "strip_size_kb": 0, 00:23:04.667 "state": "configuring", 00:23:04.667 "raid_level": "raid1", 00:23:04.667 "superblock": true, 00:23:04.667 "num_base_bdevs": 4, 00:23:04.667 "num_base_bdevs_discovered": 1, 00:23:04.667 "num_base_bdevs_operational": 3, 00:23:04.667 "base_bdevs_list": [ 00:23:04.667 { 00:23:04.667 "name": null, 00:23:04.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:04.667 "is_configured": false, 00:23:04.667 "data_offset": 2048, 00:23:04.667 "data_size": 63488 00:23:04.667 }, 00:23:04.667 { 00:23:04.667 "name": "pt2", 00:23:04.667 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:04.667 "is_configured": true, 00:23:04.667 "data_offset": 2048, 00:23:04.667 "data_size": 63488 00:23:04.667 }, 00:23:04.667 { 00:23:04.667 "name": null, 00:23:04.667 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:04.667 "is_configured": false, 00:23:04.667 "data_offset": 2048, 00:23:04.667 "data_size": 63488 00:23:04.667 }, 00:23:04.667 { 00:23:04.667 "name": null, 00:23:04.667 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:04.667 "is_configured": 
false, 00:23:04.667 "data_offset": 2048, 00:23:04.667 "data_size": 63488 00:23:04.667 } 00:23:04.667 ] 00:23:04.667 }' 00:23:04.667 22:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:04.667 22:30:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:05.235 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:23:05.235 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:05.235 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:05.495 [2024-07-12 22:30:15.611816] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:05.495 [2024-07-12 22:30:15.611870] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:05.495 [2024-07-12 22:30:15.611894] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1010a10 00:23:05.495 [2024-07-12 22:30:15.611907] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:05.495 [2024-07-12 22:30:15.612257] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:05.495 [2024-07-12 22:30:15.612276] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:05.495 [2024-07-12 22:30:15.612340] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:23:05.495 [2024-07-12 22:30:15.612359] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:05.495 pt3 00:23:05.495 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:05.495 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:05.495 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:05.495 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:05.495 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:05.495 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:05.495 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:05.495 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:05.495 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:05.495 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:05.495 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.495 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.755 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:05.755 "name": "raid_bdev1", 00:23:05.755 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:23:05.755 "strip_size_kb": 0, 00:23:05.755 "state": "configuring", 00:23:05.755 "raid_level": "raid1", 00:23:05.755 "superblock": true, 00:23:05.755 "num_base_bdevs": 
4, 00:23:05.755 "num_base_bdevs_discovered": 2, 00:23:05.755 "num_base_bdevs_operational": 3, 00:23:05.755 "base_bdevs_list": [ 00:23:05.755 { 00:23:05.755 "name": null, 00:23:05.755 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.755 "is_configured": false, 00:23:05.755 "data_offset": 2048, 00:23:05.755 "data_size": 63488 00:23:05.755 }, 00:23:05.755 { 00:23:05.755 "name": "pt2", 00:23:05.755 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:05.755 "is_configured": true, 00:23:05.755 "data_offset": 2048, 00:23:05.755 "data_size": 63488 00:23:05.755 }, 00:23:05.755 { 00:23:05.755 "name": "pt3", 00:23:05.755 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:05.755 "is_configured": true, 00:23:05.755 "data_offset": 2048, 00:23:05.755 "data_size": 63488 00:23:05.755 }, 00:23:05.755 { 00:23:05.755 "name": null, 00:23:05.755 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:05.755 "is_configured": false, 00:23:05.755 "data_offset": 2048, 00:23:05.755 "data_size": 63488 00:23:05.755 } 00:23:05.755 ] 00:23:05.755 }' 00:23:05.755 22:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:05.755 22:30:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:06.324 [2024-07-12 22:30:16.622497] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:06.324 [2024-07-12 22:30:16.622548] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:06.324 [2024-07-12 22:30:16.622567] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11b3520 00:23:06.324 [2024-07-12 22:30:16.622580] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:06.324 [2024-07-12 22:30:16.622934] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:06.324 [2024-07-12 22:30:16.622951] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:06.324 [2024-07-12 22:30:16.623018] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:06.324 [2024-07-12 22:30:16.623038] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:06.324 [2024-07-12 22:30:16.623150] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1007ea0 00:23:06.324 [2024-07-12 22:30:16.623161] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:06.324 [2024-07-12 22:30:16.623334] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x100c600 00:23:06.324 [2024-07-12 22:30:16.623463] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1007ea0 00:23:06.324 [2024-07-12 22:30:16.623473] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1007ea0 00:23:06.324 [2024-07-12 22:30:16.623568] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:06.324 pt4 00:23:06.324 22:30:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:06.324 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.583 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.583 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:06.583 "name": "raid_bdev1", 00:23:06.583 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:23:06.583 "strip_size_kb": 0, 00:23:06.583 "state": "online", 00:23:06.583 "raid_level": "raid1", 00:23:06.583 "superblock": true, 00:23:06.583 "num_base_bdevs": 4, 00:23:06.583 "num_base_bdevs_discovered": 3, 00:23:06.583 "num_base_bdevs_operational": 3, 00:23:06.583 "base_bdevs_list": [ 00:23:06.583 { 00:23:06.583 "name": null, 00:23:06.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:06.583 "is_configured": false, 00:23:06.583 "data_offset": 2048, 00:23:06.583 "data_size": 63488 00:23:06.583 }, 00:23:06.583 { 00:23:06.583 "name": "pt2", 00:23:06.583 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:06.583 "is_configured": true, 00:23:06.583 "data_offset": 2048, 00:23:06.583 "data_size": 63488 00:23:06.583 }, 00:23:06.583 { 00:23:06.583 "name": "pt3", 00:23:06.583 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:06.583 "is_configured": true, 00:23:06.583 "data_offset": 2048, 00:23:06.583 "data_size": 63488 00:23:06.583 }, 00:23:06.583 { 00:23:06.583 "name": "pt4", 00:23:06.583 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:06.584 "is_configured": true, 00:23:06.584 "data_offset": 2048, 00:23:06.584 "data_size": 63488 00:23:06.584 } 00:23:06.584 ] 00:23:06.584 }' 00:23:06.584 22:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:06.584 22:30:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:07.152 22:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:07.412 [2024-07-12 22:30:17.677291] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:07.412 [2024-07-12 22:30:17.677316] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:07.412 [2024-07-12 22:30:17.677369] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:23:07.412 [2024-07-12 22:30:17.677436] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:07.412 [2024-07-12 22:30:17.677448] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1007ea0 name raid_bdev1, state offline 00:23:07.412 22:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:23:07.412 22:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.671 22:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:23:07.671 22:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:23:07.671 22:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:23:07.671 22:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:23:07.671 22:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:07.931 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:08.191 [2024-07-12 22:30:18.411399] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:08.191 [2024-07-12 22:30:18.411449] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.191 [2024-07-12 22:30:18.411466] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11b3520 00:23:08.191 [2024-07-12 22:30:18.411479] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.191 [2024-07-12 22:30:18.413098] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.191 [2024-07-12 22:30:18.413127] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:08.191 [2024-07-12 22:30:18.413191] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:08.191 [2024-07-12 22:30:18.413224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:08.191 [2024-07-12 22:30:18.413327] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:08.191 [2024-07-12 22:30:18.413340] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:08.191 [2024-07-12 22:30:18.413354] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1007060 name raid_bdev1, state configuring 00:23:08.191 [2024-07-12 22:30:18.413377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:08.191 [2024-07-12 22:30:18.413453] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:08.191 pt1 00:23:08.191 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:23:08.191 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:08.191 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:08.191 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:23:08.191 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:08.191 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:08.191 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:08.191 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:08.191 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:08.191 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:08.191 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:08.192 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.192 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.451 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:08.451 "name": "raid_bdev1", 00:23:08.451 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:23:08.451 "strip_size_kb": 0, 00:23:08.451 "state": "configuring", 00:23:08.451 "raid_level": "raid1", 00:23:08.451 "superblock": true, 00:23:08.451 "num_base_bdevs": 4, 00:23:08.451 "num_base_bdevs_discovered": 2, 00:23:08.451 "num_base_bdevs_operational": 3, 00:23:08.451 "base_bdevs_list": [ 00:23:08.451 { 00:23:08.451 "name": null, 00:23:08.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.451 "is_configured": false, 00:23:08.451 "data_offset": 2048, 00:23:08.451 "data_size": 63488 00:23:08.451 }, 00:23:08.451 { 00:23:08.451 "name": "pt2", 00:23:08.451 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:08.451 "is_configured": true, 00:23:08.451 "data_offset": 2048, 00:23:08.451 "data_size": 63488 00:23:08.451 }, 00:23:08.451 { 00:23:08.451 "name": "pt3", 00:23:08.451 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:08.451 "is_configured": true, 00:23:08.451 "data_offset": 2048, 00:23:08.451 "data_size": 63488 00:23:08.451 }, 00:23:08.451 { 00:23:08.451 "name": null, 00:23:08.451 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:08.451 "is_configured": false, 00:23:08.451 "data_offset": 2048, 00:23:08.451 "data_size": 63488 00:23:08.451 } 00:23:08.451 ] 00:23:08.451 }' 00:23:08.451 22:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:08.451 22:30:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:09.019 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:23:09.019 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:09.279 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:23:09.279 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:09.538 [2024-07-12 22:30:19.746962] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:09.538 [2024-07-12 22:30:19.747024] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:09.538 [2024-07-12 22:30:19.747045] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1007310 00:23:09.538 [2024-07-12 22:30:19.747058] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:09.538 [2024-07-12 22:30:19.747415] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:09.538 [2024-07-12 22:30:19.747432] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:09.538 [2024-07-12 22:30:19.747497] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:09.538 [2024-07-12 22:30:19.747516] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:09.538 [2024-07-12 22:30:19.747628] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x100ab40 00:23:09.538 [2024-07-12 22:30:19.747638] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:09.538 [2024-07-12 22:30:19.747812] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11aa990 00:23:09.538 [2024-07-12 22:30:19.747955] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x100ab40 00:23:09.538 [2024-07-12 22:30:19.747965] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x100ab40 00:23:09.538 [2024-07-12 22:30:19.748064] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:09.538 pt4 00:23:09.538 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:09.538 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:09.538 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:09.538 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:09.538 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:09.538 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:09.538 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:09.538 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:09.538 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:09.538 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:09.538 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.538 22:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.797 22:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:09.797 "name": "raid_bdev1", 00:23:09.797 "uuid": "16021fd1-39d5-464a-88ee-0d1a07fec421", 00:23:09.797 "strip_size_kb": 0, 00:23:09.797 "state": "online", 00:23:09.797 "raid_level": "raid1", 00:23:09.797 "superblock": true, 00:23:09.797 "num_base_bdevs": 4, 00:23:09.797 "num_base_bdevs_discovered": 3, 00:23:09.797 "num_base_bdevs_operational": 3, 00:23:09.797 "base_bdevs_list": [ 00:23:09.797 { 00:23:09.797 "name": null, 00:23:09.797 
"uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.797 "is_configured": false, 00:23:09.797 "data_offset": 2048, 00:23:09.797 "data_size": 63488 00:23:09.797 }, 00:23:09.797 { 00:23:09.797 "name": "pt2", 00:23:09.797 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:09.797 "is_configured": true, 00:23:09.797 "data_offset": 2048, 00:23:09.797 "data_size": 63488 00:23:09.797 }, 00:23:09.797 { 00:23:09.797 "name": "pt3", 00:23:09.798 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:09.798 "is_configured": true, 00:23:09.798 "data_offset": 2048, 00:23:09.798 "data_size": 63488 00:23:09.798 }, 00:23:09.798 { 00:23:09.798 "name": "pt4", 00:23:09.798 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:09.798 "is_configured": true, 00:23:09.798 "data_offset": 2048, 00:23:09.798 "data_size": 63488 00:23:09.798 } 00:23:09.798 ] 00:23:09.798 }' 00:23:09.798 22:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:09.798 22:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:10.365 22:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:10.365 22:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:10.624 22:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:23:10.624 22:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:10.624 22:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:23:10.884 [2024-07-12 22:30:21.034638] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:10.884 22:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 16021fd1-39d5-464a-88ee-0d1a07fec421 '!=' 16021fd1-39d5-464a-88ee-0d1a07fec421 ']' 00:23:10.884 22:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 3521020 00:23:10.884 22:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 3521020 ']' 00:23:10.884 22:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 3521020 00:23:10.884 22:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:23:10.884 22:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:10.884 22:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3521020 00:23:10.884 22:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:10.884 22:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:10.884 22:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3521020' 00:23:10.884 killing process with pid 3521020 00:23:10.884 22:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 3521020 00:23:10.884 [2024-07-12 22:30:21.103976] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:10.884 [2024-07-12 22:30:21.104039] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:10.884 [2024-07-12 22:30:21.104110] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:10.884 [2024-07-12 22:30:21.104124] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x100ab40 name raid_bdev1, state offline 00:23:10.884 22:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 3521020 00:23:10.884 [2024-07-12 22:30:21.142785] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:11.144 22:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:23:11.144 00:23:11.144 real 0m25.195s 00:23:11.144 user 0m46.041s 00:23:11.144 sys 0m4.562s 00:23:11.144 22:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:11.144 22:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:11.144 ************************************ 00:23:11.144 END TEST raid_superblock_test 00:23:11.144 ************************************ 00:23:11.144 22:30:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:11.144 22:30:21 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:23:11.144 22:30:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:11.144 22:30:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:11.144 22:30:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:11.144 ************************************ 00:23:11.144 START TEST raid_read_error_test 00:23:11.144 ************************************ 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 
'BaseBdev3' 'BaseBdev4') 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:23:11.144 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:23:11.404 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.IJ5KiJfZgI 00:23:11.404 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3525288 00:23:11.404 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3525288 /var/tmp/spdk-raid.sock 00:23:11.404 22:30:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 3525288 ']' 00:23:11.404 22:30:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:11.404 22:30:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:11.404 22:30:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:11.404 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:11.404 22:30:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:11.404 22:30:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:11.404 22:30:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:11.404 [2024-07-12 22:30:21.531584] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:23:11.404 [2024-07-12 22:30:21.531645] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3525288 ] 00:23:11.404 [2024-07-12 22:30:21.660667] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:11.663 [2024-07-12 22:30:21.763485] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:11.663 [2024-07-12 22:30:21.834688] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:11.663 [2024-07-12 22:30:21.834730] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:12.603 22:30:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:12.603 22:30:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:23:12.603 22:30:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:12.603 22:30:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:12.862 BaseBdev1_malloc 00:23:12.862 22:30:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:13.121 true 00:23:13.121 22:30:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:13.688 [2024-07-12 22:30:23.723673] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:13.688 [2024-07-12 22:30:23.723722] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:13.688 [2024-07-12 22:30:23.723744] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f70d0 00:23:13.688 [2024-07-12 22:30:23.723758] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:13.688 [2024-07-12 22:30:23.725679] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:13.688 [2024-07-12 22:30:23.725707] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:13.688 BaseBdev1 00:23:13.688 22:30:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:13.688 22:30:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:13.688 BaseBdev2_malloc 00:23:13.688 22:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:14.256 true 00:23:14.256 22:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:14.516 [2024-07-12 22:30:24.752267] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:14.516 [2024-07-12 22:30:24.752313] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:14.516 [2024-07-12 22:30:24.752336] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16fb910 00:23:14.516 [2024-07-12 22:30:24.752349] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:14.516 [2024-07-12 22:30:24.754012] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:14.516 [2024-07-12 22:30:24.754040] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:14.516 BaseBdev2 00:23:14.516 22:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:14.516 22:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:15.084 BaseBdev3_malloc 00:23:15.084 22:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:15.343 true 00:23:15.343 22:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:15.912 [2024-07-12 22:30:26.008115] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:15.912 [2024-07-12 22:30:26.008164] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:15.912 [2024-07-12 22:30:26.008186] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16fdbd0 00:23:15.912 [2024-07-12 22:30:26.008199] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:15.912 [2024-07-12 22:30:26.009835] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:15.912 [2024-07-12 22:30:26.009865] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:15.912 BaseBdev3 00:23:15.912 22:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:15.912 22:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:16.480 BaseBdev4_malloc 00:23:16.480 22:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:16.739 true 00:23:16.998 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:16.998 [2024-07-12 22:30:27.293487] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:16.998 [2024-07-12 22:30:27.293532] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:16.998 [2024-07-12 22:30:27.293553] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16feaa0 00:23:16.998 [2024-07-12 22:30:27.293566] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:16.998 [2024-07-12 22:30:27.295131] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:23:16.998 [2024-07-12 22:30:27.295159] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:16.998 BaseBdev4 00:23:16.998 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:17.566 [2024-07-12 22:30:27.790817] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:17.566 [2024-07-12 22:30:27.792173] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:17.566 [2024-07-12 22:30:27.792253] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:17.566 [2024-07-12 22:30:27.792315] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:17.566 [2024-07-12 22:30:27.792554] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16f8c20 00:23:17.566 [2024-07-12 22:30:27.792566] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:17.566 [2024-07-12 22:30:27.792762] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x154d260 00:23:17.566 [2024-07-12 22:30:27.792924] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16f8c20 00:23:17.566 [2024-07-12 22:30:27.792943] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16f8c20 00:23:17.566 [2024-07-12 22:30:27.793055] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:17.566 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:17.566 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:17.566 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:17.566 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:17.566 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:17.566 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:17.566 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:17.566 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:17.566 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:17.566 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:17.566 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.566 22:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.826 22:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.826 "name": "raid_bdev1", 00:23:17.826 "uuid": "47c3ef46-e674-4437-ae76-27680c321f49", 00:23:17.826 "strip_size_kb": 0, 00:23:17.826 "state": "online", 00:23:17.826 "raid_level": "raid1", 00:23:17.826 "superblock": true, 00:23:17.826 "num_base_bdevs": 4, 00:23:17.826 "num_base_bdevs_discovered": 4, 00:23:17.826 
"num_base_bdevs_operational": 4, 00:23:17.826 "base_bdevs_list": [ 00:23:17.826 { 00:23:17.826 "name": "BaseBdev1", 00:23:17.826 "uuid": "0689018d-5e9b-52df-ad41-5d3386e825e8", 00:23:17.826 "is_configured": true, 00:23:17.826 "data_offset": 2048, 00:23:17.826 "data_size": 63488 00:23:17.826 }, 00:23:17.826 { 00:23:17.826 "name": "BaseBdev2", 00:23:17.826 "uuid": "eec63ac4-dd67-567d-afc1-5485af884c9a", 00:23:17.826 "is_configured": true, 00:23:17.826 "data_offset": 2048, 00:23:17.826 "data_size": 63488 00:23:17.826 }, 00:23:17.826 { 00:23:17.826 "name": "BaseBdev3", 00:23:17.826 "uuid": "69bc47b9-c89b-5b54-98e9-66ab5e4e9cec", 00:23:17.826 "is_configured": true, 00:23:17.826 "data_offset": 2048, 00:23:17.826 "data_size": 63488 00:23:17.826 }, 00:23:17.826 { 00:23:17.826 "name": "BaseBdev4", 00:23:17.826 "uuid": "e62801ab-38a4-5a74-8092-2f53116a47e4", 00:23:17.826 "is_configured": true, 00:23:17.826 "data_offset": 2048, 00:23:17.826 "data_size": 63488 00:23:17.826 } 00:23:17.826 ] 00:23:17.826 }' 00:23:17.826 22:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.826 22:30:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:18.394 22:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:23:18.395 22:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:18.654 [2024-07-12 22:30:28.741618] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x154cc60 00:23:19.590 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:23:19.590 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:23:19.590 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:19.590 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:23:19.590 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:23:19.590 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:19.591 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:19.591 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:19.591 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:19.591 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:19.591 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:19.591 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:19.591 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:19.591 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:19.591 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:19.591 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.591 22:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.850 22:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:19.850 "name": "raid_bdev1", 00:23:19.850 "uuid": "47c3ef46-e674-4437-ae76-27680c321f49", 00:23:19.850 "strip_size_kb": 0, 00:23:19.850 "state": "online", 00:23:19.850 "raid_level": "raid1", 00:23:19.850 "superblock": true, 00:23:19.850 "num_base_bdevs": 4, 00:23:19.850 "num_base_bdevs_discovered": 4, 00:23:19.850 "num_base_bdevs_operational": 4, 00:23:19.850 "base_bdevs_list": [ 00:23:19.850 { 00:23:19.850 "name": "BaseBdev1", 00:23:19.850 "uuid": "0689018d-5e9b-52df-ad41-5d3386e825e8", 00:23:19.850 "is_configured": true, 00:23:19.850 "data_offset": 2048, 00:23:19.850 "data_size": 63488 00:23:19.850 }, 00:23:19.850 { 00:23:19.850 "name": "BaseBdev2", 00:23:19.850 "uuid": "eec63ac4-dd67-567d-afc1-5485af884c9a", 00:23:19.850 "is_configured": true, 00:23:19.850 "data_offset": 2048, 00:23:19.850 "data_size": 63488 00:23:19.850 }, 00:23:19.850 { 00:23:19.850 "name": "BaseBdev3", 00:23:19.850 "uuid": "69bc47b9-c89b-5b54-98e9-66ab5e4e9cec", 00:23:19.850 "is_configured": true, 00:23:19.850 "data_offset": 2048, 00:23:19.850 "data_size": 63488 00:23:19.850 }, 00:23:19.850 { 00:23:19.850 "name": "BaseBdev4", 00:23:19.850 "uuid": "e62801ab-38a4-5a74-8092-2f53116a47e4", 00:23:19.850 "is_configured": true, 00:23:19.850 "data_offset": 2048, 00:23:19.850 "data_size": 63488 00:23:19.850 } 00:23:19.850 ] 00:23:19.850 }' 00:23:19.850 22:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:19.850 22:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:20.418 22:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:20.677 [2024-07-12 22:30:30.862260] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:20.677 [2024-07-12 22:30:30.862301] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:20.677 [2024-07-12 22:30:30.865483] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:20.677 [2024-07-12 22:30:30.865521] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:20.677 [2024-07-12 22:30:30.865639] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:20.677 [2024-07-12 22:30:30.865651] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f8c20 name raid_bdev1, state offline 00:23:20.677 0 00:23:20.677 22:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3525288 00:23:20.677 22:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 3525288 ']' 00:23:20.677 22:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 3525288 00:23:20.677 22:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:23:20.677 22:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:20.678 22:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3525288 00:23:20.678 22:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:23:20.678 22:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:20.678 22:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3525288' 00:23:20.678 killing process with pid 3525288 00:23:20.678 22:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 3525288 00:23:20.678 [2024-07-12 22:30:30.928254] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:20.678 22:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 3525288 00:23:20.678 [2024-07-12 22:30:30.960901] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:20.937 22:30:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.IJ5KiJfZgI 00:23:20.937 22:30:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:20.937 22:30:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:20.937 22:30:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:23:20.937 22:30:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:23:20.937 22:30:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:20.937 22:30:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:20.937 22:30:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:20.937 00:23:20.937 real 0m9.736s 00:23:20.937 user 0m16.060s 00:23:20.937 sys 0m1.636s 00:23:20.937 22:30:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:20.937 22:30:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:20.937 ************************************ 00:23:20.937 END TEST raid_read_error_test 00:23:20.937 ************************************ 00:23:20.937 22:30:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:20.937 22:30:31 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:23:20.937 22:30:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:20.937 22:30:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:20.937 22:30:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:21.197 ************************************ 00:23:21.197 START TEST raid_write_error_test 00:23:21.197 ************************************ 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:21.197 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.f0VWmhhdFH 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=3526678 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 3526678 /var/tmp/spdk-raid.sock 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 3526678 ']' 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:21.198 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:21.198 22:30:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:21.198 [2024-07-12 22:30:31.359167] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:23:21.198 [2024-07-12 22:30:31.359227] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3526678 ] 00:23:21.198 [2024-07-12 22:30:31.472331] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:21.457 [2024-07-12 22:30:31.575512] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:21.457 [2024-07-12 22:30:31.633131] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:21.457 [2024-07-12 22:30:31.633162] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:22.067 22:30:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:22.067 22:30:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:23:22.067 22:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:22.067 22:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:22.067 BaseBdev1_malloc 00:23:22.067 22:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:22.326 true 00:23:22.326 22:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:22.585 [2024-07-12 22:30:32.686056] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:22.585 [2024-07-12 22:30:32.686102] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:22.585 [2024-07-12 22:30:32.686121] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fe50d0 00:23:22.585 [2024-07-12 22:30:32.686133] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:22.585 [2024-07-12 22:30:32.687871] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:22.585 [2024-07-12 22:30:32.687899] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:22.585 BaseBdev1 00:23:22.585 22:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:22.585 22:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:22.843 BaseBdev2_malloc 00:23:22.843 22:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:22.843 true 00:23:22.843 22:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:23.102 [2024-07-12 22:30:33.284258] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:23.102 [2024-07-12 22:30:33.284301] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:23.102 [2024-07-12 22:30:33.284320] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fe9910 00:23:23.102 [2024-07-12 22:30:33.284332] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:23.102 [2024-07-12 22:30:33.285708] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:23.102 [2024-07-12 22:30:33.285740] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:23.102 BaseBdev2 00:23:23.102 22:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:23.102 22:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:23.361 BaseBdev3_malloc 00:23:23.361 22:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:23.361 true 00:23:23.361 22:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:23.620 [2024-07-12 22:30:33.790102] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:23.620 [2024-07-12 22:30:33.790145] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:23.620 [2024-07-12 22:30:33.790163] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1febbd0 00:23:23.620 [2024-07-12 22:30:33.790176] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:23.620 [2024-07-12 22:30:33.791545] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:23.620 [2024-07-12 22:30:33.791572] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:23.620 BaseBdev3 00:23:23.620 22:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:23.620 22:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:23.878 BaseBdev4_malloc 00:23:23.878 22:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:24.136 true 00:23:24.136 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:24.136 [2024-07-12 22:30:34.448395] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:24.136 [2024-07-12 22:30:34.448440] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:23:24.136 [2024-07-12 22:30:34.448459] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fecaa0 00:23:24.136 [2024-07-12 22:30:34.448471] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:24.136 [2024-07-12 22:30:34.449863] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:24.136 [2024-07-12 22:30:34.449889] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:24.136 BaseBdev4 00:23:24.395 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:24.395 [2024-07-12 22:30:34.620875] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:24.395 [2024-07-12 22:30:34.622046] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:24.395 [2024-07-12 22:30:34.622112] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:24.395 [2024-07-12 22:30:34.622173] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:24.395 [2024-07-12 22:30:34.622398] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fe6c20 00:23:24.395 [2024-07-12 22:30:34.622409] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:24.395 [2024-07-12 22:30:34.622580] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e3b260 00:23:24.395 [2024-07-12 22:30:34.622729] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fe6c20 00:23:24.396 [2024-07-12 22:30:34.622744] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fe6c20 00:23:24.396 [2024-07-12 22:30:34.622842] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:24.396 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:24.396 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:24.396 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:24.396 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:24.396 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:24.396 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:24.396 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:24.396 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:24.396 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:24.396 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:24.396 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.396 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:24.655 22:30:34 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:24.655 "name": "raid_bdev1", 00:23:24.655 "uuid": "04d6766c-58a2-44c7-b303-36a8ecd4f762", 00:23:24.655 "strip_size_kb": 0, 00:23:24.655 "state": "online", 00:23:24.655 "raid_level": "raid1", 00:23:24.655 "superblock": true, 00:23:24.655 "num_base_bdevs": 4, 00:23:24.655 "num_base_bdevs_discovered": 4, 00:23:24.655 "num_base_bdevs_operational": 4, 00:23:24.655 "base_bdevs_list": [ 00:23:24.655 { 00:23:24.655 "name": "BaseBdev1", 00:23:24.655 "uuid": "594cca5d-832a-5f5a-b8d9-dc5b0cc27de3", 00:23:24.655 "is_configured": true, 00:23:24.655 "data_offset": 2048, 00:23:24.655 "data_size": 63488 00:23:24.655 }, 00:23:24.655 { 00:23:24.655 "name": "BaseBdev2", 00:23:24.655 "uuid": "dcd73587-c414-5f28-98fa-4fe4d7367a7e", 00:23:24.655 "is_configured": true, 00:23:24.655 "data_offset": 2048, 00:23:24.655 "data_size": 63488 00:23:24.655 }, 00:23:24.655 { 00:23:24.655 "name": "BaseBdev3", 00:23:24.655 "uuid": "63720b50-b282-54af-a309-f51ea97c7046", 00:23:24.655 "is_configured": true, 00:23:24.655 "data_offset": 2048, 00:23:24.655 "data_size": 63488 00:23:24.655 }, 00:23:24.655 { 00:23:24.655 "name": "BaseBdev4", 00:23:24.655 "uuid": "9c8c26b0-42df-521c-a64b-c4548e94e060", 00:23:24.655 "is_configured": true, 00:23:24.655 "data_offset": 2048, 00:23:24.655 "data_size": 63488 00:23:24.655 } 00:23:24.655 ] 00:23:24.655 }' 00:23:24.655 22:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:24.655 22:30:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:25.222 22:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:25.222 22:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:23:25.222 [2024-07-12 22:30:35.515535] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e3ac60 00:23:26.158 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:23:26.417 [2024-07-12 22:30:36.649427] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:23:26.417 [2024-07-12 22:30:36.649491] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:26.417 [2024-07-12 22:30:36.649709] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1e3ac60 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:26.417 
22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.417 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:26.675 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:26.675 "name": "raid_bdev1", 00:23:26.675 "uuid": "04d6766c-58a2-44c7-b303-36a8ecd4f762", 00:23:26.675 "strip_size_kb": 0, 00:23:26.675 "state": "online", 00:23:26.675 "raid_level": "raid1", 00:23:26.675 "superblock": true, 00:23:26.675 "num_base_bdevs": 4, 00:23:26.675 "num_base_bdevs_discovered": 3, 00:23:26.675 "num_base_bdevs_operational": 3, 00:23:26.675 "base_bdevs_list": [ 00:23:26.675 { 00:23:26.675 "name": null, 00:23:26.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:26.675 "is_configured": false, 00:23:26.675 "data_offset": 2048, 00:23:26.675 "data_size": 63488 00:23:26.675 }, 00:23:26.675 { 00:23:26.675 "name": "BaseBdev2", 00:23:26.675 "uuid": "dcd73587-c414-5f28-98fa-4fe4d7367a7e", 00:23:26.675 "is_configured": true, 00:23:26.675 "data_offset": 2048, 00:23:26.675 "data_size": 63488 00:23:26.675 }, 00:23:26.675 { 00:23:26.675 "name": "BaseBdev3", 00:23:26.675 "uuid": "63720b50-b282-54af-a309-f51ea97c7046", 00:23:26.675 "is_configured": true, 00:23:26.675 "data_offset": 2048, 00:23:26.675 "data_size": 63488 00:23:26.675 }, 00:23:26.675 { 00:23:26.675 "name": "BaseBdev4", 00:23:26.675 "uuid": "9c8c26b0-42df-521c-a64b-c4548e94e060", 00:23:26.675 "is_configured": true, 00:23:26.675 "data_offset": 2048, 00:23:26.675 "data_size": 63488 00:23:26.675 } 00:23:26.675 ] 00:23:26.675 }' 00:23:26.675 22:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:26.676 22:30:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:27.243 22:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:27.503 [2024-07-12 22:30:37.760465] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:27.503 [2024-07-12 22:30:37.760503] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:27.503 [2024-07-12 22:30:37.763634] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:27.503 [2024-07-12 22:30:37.763671] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:27.503 [2024-07-12 22:30:37.763770] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:27.503 [2024-07-12 22:30:37.763782] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fe6c20 name raid_bdev1, state 
offline 00:23:27.503 0 00:23:27.503 22:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 3526678 00:23:27.503 22:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 3526678 ']' 00:23:27.503 22:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 3526678 00:23:27.503 22:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:23:27.503 22:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:27.503 22:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3526678 00:23:27.762 22:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:27.762 22:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:27.762 22:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3526678' 00:23:27.762 killing process with pid 3526678 00:23:27.762 22:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 3526678 00:23:27.762 [2024-07-12 22:30:37.829481] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:27.762 22:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 3526678 00:23:27.762 [2024-07-12 22:30:37.860444] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:27.762 22:30:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.f0VWmhhdFH 00:23:27.762 22:30:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:27.762 22:30:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:27.762 22:30:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:23:27.762 22:30:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:23:27.762 22:30:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:27.762 22:30:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:27.762 22:30:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:28.021 00:23:28.021 real 0m6.802s 00:23:28.021 user 0m10.689s 00:23:28.021 sys 0m1.206s 00:23:28.021 22:30:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:28.021 22:30:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:28.021 ************************************ 00:23:28.022 END TEST raid_write_error_test 00:23:28.022 ************************************ 00:23:28.022 22:30:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:28.022 22:30:38 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:23:28.022 22:30:38 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:23:28.022 22:30:38 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:23:28.022 22:30:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:28.022 22:30:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:28.022 22:30:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:28.022 ************************************ 00:23:28.022 START TEST raid_rebuild_test 00:23:28.022 ************************************ 00:23:28.022 
22:30:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:28.022 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=3527713 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 3527713 /var/tmp/spdk-raid.sock 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 3527713 ']' 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:28.023 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:28.023 22:30:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:28.023 [2024-07-12 22:30:38.240690] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:23:28.023 [2024-07-12 22:30:38.240757] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3527713 ] 00:23:28.024 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:28.024 Zero copy mechanism will not be used. 00:23:28.283 [2024-07-12 22:30:38.370662] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:28.283 [2024-07-12 22:30:38.476425] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:28.283 [2024-07-12 22:30:38.544197] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:28.283 [2024-07-12 22:30:38.544241] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:29.220 22:30:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:29.220 22:30:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:23:29.220 22:30:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:29.220 22:30:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:29.479 BaseBdev1_malloc 00:23:29.479 22:30:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:30.048 [2024-07-12 22:30:40.170674] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:30.048 [2024-07-12 22:30:40.170731] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.048 [2024-07-12 22:30:40.170763] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e4cd40 00:23:30.048 [2024-07-12 22:30:40.170780] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.048 [2024-07-12 22:30:40.172657] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.048 [2024-07-12 22:30:40.172686] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:30.048 BaseBdev1 00:23:30.048 22:30:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:30.048 22:30:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:30.307 BaseBdev2_malloc 00:23:30.307 22:30:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 
00:23:30.876 [2024-07-12 22:30:40.938766] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:30.876 [2024-07-12 22:30:40.938815] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.876 [2024-07-12 22:30:40.938847] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e4d860 00:23:30.876 [2024-07-12 22:30:40.938860] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.876 [2024-07-12 22:30:40.940456] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.876 [2024-07-12 22:30:40.940484] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:30.876 BaseBdev2 00:23:30.876 22:30:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:31.152 spare_malloc 00:23:31.152 22:30:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:31.411 spare_delay 00:23:31.411 22:30:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:31.671 [2024-07-12 22:30:41.953990] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:31.671 [2024-07-12 22:30:41.954041] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:31.671 [2024-07-12 22:30:41.954067] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ffbec0 00:23:31.671 [2024-07-12 22:30:41.954079] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:31.671 [2024-07-12 22:30:41.955646] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:31.671 [2024-07-12 22:30:41.955674] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:31.671 spare 00:23:31.671 22:30:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:32.240 [2024-07-12 22:30:42.451308] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:32.240 [2024-07-12 22:30:42.452675] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:32.240 [2024-07-12 22:30:42.452755] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ffd070 00:23:32.240 [2024-07-12 22:30:42.452766] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:32.240 [2024-07-12 22:30:42.452988] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ff6490 00:23:32.240 [2024-07-12 22:30:42.453133] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ffd070 00:23:32.240 [2024-07-12 22:30:42.453144] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ffd070 00:23:32.240 [2024-07-12 22:30:42.453263] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:32.240 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:32.240 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:32.240 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:32.240 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:32.240 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:32.240 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:32.240 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:32.240 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:32.240 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:32.240 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:32.240 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.240 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.500 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:32.500 "name": "raid_bdev1", 00:23:32.500 "uuid": "a8d3d660-3656-415c-97a3-b33a50929304", 00:23:32.500 "strip_size_kb": 0, 00:23:32.500 "state": "online", 00:23:32.500 "raid_level": "raid1", 00:23:32.500 "superblock": false, 00:23:32.500 "num_base_bdevs": 2, 00:23:32.500 "num_base_bdevs_discovered": 2, 00:23:32.500 "num_base_bdevs_operational": 2, 00:23:32.500 "base_bdevs_list": [ 00:23:32.500 { 00:23:32.500 "name": "BaseBdev1", 00:23:32.500 "uuid": "6f3de792-480c-5239-ad6e-1b313ab524de", 00:23:32.500 "is_configured": true, 00:23:32.500 "data_offset": 0, 00:23:32.500 "data_size": 65536 00:23:32.500 }, 00:23:32.500 { 00:23:32.500 "name": "BaseBdev2", 00:23:32.500 "uuid": "b456e1a3-1668-5989-bb82-b7b210fffcfb", 00:23:32.500 "is_configured": true, 00:23:32.500 "data_offset": 0, 00:23:32.500 "data_size": 65536 00:23:32.500 } 00:23:32.500 ] 00:23:32.500 }' 00:23:32.500 22:30:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:32.500 22:30:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:33.067 22:30:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:33.067 22:30:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:33.326 [2024-07-12 22:30:43.546447] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:33.326 22:30:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:33.326 22:30:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.326 22:30:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test 
-- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:33.586 22:30:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:33.846 [2024-07-12 22:30:44.039550] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ff6490 00:23:33.846 /dev/nbd0 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:33.846 1+0 records in 00:23:33.846 1+0 records out 00:23:33.846 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270889 s, 15.1 MB/s 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test 
-- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:33.846 22:30:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:23:40.414 65536+0 records in 00:23:40.414 65536+0 records out 00:23:40.414 33554432 bytes (34 MB, 32 MiB) copied, 5.34594 s, 6.3 MB/s 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:40.414 [2024-07-12 22:30:49.660949] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:40.414 [2024-07-12 22:30:49.817403] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:40.414 
22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.414 22:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.414 22:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:40.414 "name": "raid_bdev1", 00:23:40.414 "uuid": "a8d3d660-3656-415c-97a3-b33a50929304", 00:23:40.414 "strip_size_kb": 0, 00:23:40.414 "state": "online", 00:23:40.414 "raid_level": "raid1", 00:23:40.414 "superblock": false, 00:23:40.414 "num_base_bdevs": 2, 00:23:40.414 "num_base_bdevs_discovered": 1, 00:23:40.414 "num_base_bdevs_operational": 1, 00:23:40.414 "base_bdevs_list": [ 00:23:40.414 { 00:23:40.414 "name": null, 00:23:40.414 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:40.414 "is_configured": false, 00:23:40.414 "data_offset": 0, 00:23:40.414 "data_size": 65536 00:23:40.414 }, 00:23:40.414 { 00:23:40.414 "name": "BaseBdev2", 00:23:40.414 "uuid": "b456e1a3-1668-5989-bb82-b7b210fffcfb", 00:23:40.414 "is_configured": true, 00:23:40.414 "data_offset": 0, 00:23:40.414 "data_size": 65536 00:23:40.414 } 00:23:40.414 ] 00:23:40.414 }' 00:23:40.414 22:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:40.414 22:30:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:40.414 22:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:40.673 [2024-07-12 22:30:50.900291] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:40.673 [2024-07-12 22:30:50.905211] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ffd880 00:23:40.673 [2024-07-12 22:30:50.907409] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:40.673 22:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:41.611 22:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:41.611 22:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:41.611 22:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:41.611 22:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:41.611 22:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:41.611 22:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:41.611 22:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:41.870 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:41.870 "name": "raid_bdev1", 00:23:41.870 "uuid": "a8d3d660-3656-415c-97a3-b33a50929304", 00:23:41.870 "strip_size_kb": 0, 00:23:41.870 "state": "online", 00:23:41.870 "raid_level": "raid1", 00:23:41.870 "superblock": false, 00:23:41.870 "num_base_bdevs": 2, 00:23:41.870 "num_base_bdevs_discovered": 2, 00:23:41.870 "num_base_bdevs_operational": 2, 
00:23:41.870 "process": { 00:23:41.870 "type": "rebuild", 00:23:41.870 "target": "spare", 00:23:41.870 "progress": { 00:23:41.870 "blocks": 24576, 00:23:41.870 "percent": 37 00:23:41.870 } 00:23:41.870 }, 00:23:41.870 "base_bdevs_list": [ 00:23:41.870 { 00:23:41.870 "name": "spare", 00:23:41.870 "uuid": "ea844ba0-b363-5504-adfb-ad92a06d8329", 00:23:41.870 "is_configured": true, 00:23:41.870 "data_offset": 0, 00:23:41.870 "data_size": 65536 00:23:41.870 }, 00:23:41.870 { 00:23:41.870 "name": "BaseBdev2", 00:23:41.870 "uuid": "b456e1a3-1668-5989-bb82-b7b210fffcfb", 00:23:41.870 "is_configured": true, 00:23:41.870 "data_offset": 0, 00:23:41.870 "data_size": 65536 00:23:41.870 } 00:23:41.870 ] 00:23:41.870 }' 00:23:41.870 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:42.130 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:42.130 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:42.130 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:42.130 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:42.698 [2024-07-12 22:30:52.747372] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:42.698 [2024-07-12 22:30:52.822518] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:42.698 [2024-07-12 22:30:52.822566] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:42.698 [2024-07-12 22:30:52.822581] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:42.698 [2024-07-12 22:30:52.822589] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:42.698 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:42.698 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:42.698 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:42.698 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:42.698 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:42.698 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:42.698 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:42.698 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:42.698 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:42.698 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:42.698 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.698 22:30:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:42.974 22:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:42.974 "name": "raid_bdev1", 00:23:42.974 "uuid": 
"a8d3d660-3656-415c-97a3-b33a50929304", 00:23:42.974 "strip_size_kb": 0, 00:23:42.974 "state": "online", 00:23:42.974 "raid_level": "raid1", 00:23:42.974 "superblock": false, 00:23:42.974 "num_base_bdevs": 2, 00:23:42.974 "num_base_bdevs_discovered": 1, 00:23:42.974 "num_base_bdevs_operational": 1, 00:23:42.974 "base_bdevs_list": [ 00:23:42.974 { 00:23:42.974 "name": null, 00:23:42.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:42.974 "is_configured": false, 00:23:42.974 "data_offset": 0, 00:23:42.974 "data_size": 65536 00:23:42.974 }, 00:23:42.974 { 00:23:42.974 "name": "BaseBdev2", 00:23:42.974 "uuid": "b456e1a3-1668-5989-bb82-b7b210fffcfb", 00:23:42.974 "is_configured": true, 00:23:42.974 "data_offset": 0, 00:23:42.974 "data_size": 65536 00:23:42.974 } 00:23:42.974 ] 00:23:42.974 }' 00:23:42.974 22:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:42.974 22:30:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:43.553 22:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:43.553 22:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:43.553 22:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:43.553 22:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:43.553 22:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:43.553 22:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.553 22:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.812 22:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:43.812 "name": "raid_bdev1", 00:23:43.812 "uuid": "a8d3d660-3656-415c-97a3-b33a50929304", 00:23:43.812 "strip_size_kb": 0, 00:23:43.812 "state": "online", 00:23:43.812 "raid_level": "raid1", 00:23:43.812 "superblock": false, 00:23:43.812 "num_base_bdevs": 2, 00:23:43.812 "num_base_bdevs_discovered": 1, 00:23:43.812 "num_base_bdevs_operational": 1, 00:23:43.812 "base_bdevs_list": [ 00:23:43.812 { 00:23:43.812 "name": null, 00:23:43.812 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:43.812 "is_configured": false, 00:23:43.812 "data_offset": 0, 00:23:43.812 "data_size": 65536 00:23:43.812 }, 00:23:43.812 { 00:23:43.812 "name": "BaseBdev2", 00:23:43.812 "uuid": "b456e1a3-1668-5989-bb82-b7b210fffcfb", 00:23:43.812 "is_configured": true, 00:23:43.813 "data_offset": 0, 00:23:43.813 "data_size": 65536 00:23:43.813 } 00:23:43.813 ] 00:23:43.813 }' 00:23:43.813 22:30:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:43.813 22:30:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:43.813 22:30:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:43.813 22:30:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:43.813 22:30:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:44.071 [2024-07-12 22:30:54.275331] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev spare is claimed 00:23:44.071 [2024-07-12 22:30:54.280279] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ff6490 00:23:44.071 [2024-07-12 22:30:54.281738] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:44.071 22:30:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:45.008 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:45.008 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:45.008 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:45.008 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:45.008 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:45.008 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.008 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:45.268 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:45.268 "name": "raid_bdev1", 00:23:45.268 "uuid": "a8d3d660-3656-415c-97a3-b33a50929304", 00:23:45.268 "strip_size_kb": 0, 00:23:45.268 "state": "online", 00:23:45.268 "raid_level": "raid1", 00:23:45.268 "superblock": false, 00:23:45.268 "num_base_bdevs": 2, 00:23:45.268 "num_base_bdevs_discovered": 2, 00:23:45.268 "num_base_bdevs_operational": 2, 00:23:45.268 "process": { 00:23:45.268 "type": "rebuild", 00:23:45.268 "target": "spare", 00:23:45.268 "progress": { 00:23:45.268 "blocks": 24576, 00:23:45.268 "percent": 37 00:23:45.268 } 00:23:45.268 }, 00:23:45.268 "base_bdevs_list": [ 00:23:45.268 { 00:23:45.268 "name": "spare", 00:23:45.269 "uuid": "ea844ba0-b363-5504-adfb-ad92a06d8329", 00:23:45.269 "is_configured": true, 00:23:45.269 "data_offset": 0, 00:23:45.269 "data_size": 65536 00:23:45.269 }, 00:23:45.269 { 00:23:45.269 "name": "BaseBdev2", 00:23:45.269 "uuid": "b456e1a3-1668-5989-bb82-b7b210fffcfb", 00:23:45.269 "is_configured": true, 00:23:45.269 "data_offset": 0, 00:23:45.269 "data_size": 65536 00:23:45.269 } 00:23:45.269 ] 00:23:45.269 }' 00:23:45.269 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=759 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.527 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:45.787 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:45.787 "name": "raid_bdev1", 00:23:45.787 "uuid": "a8d3d660-3656-415c-97a3-b33a50929304", 00:23:45.787 "strip_size_kb": 0, 00:23:45.787 "state": "online", 00:23:45.787 "raid_level": "raid1", 00:23:45.787 "superblock": false, 00:23:45.787 "num_base_bdevs": 2, 00:23:45.787 "num_base_bdevs_discovered": 2, 00:23:45.787 "num_base_bdevs_operational": 2, 00:23:45.787 "process": { 00:23:45.787 "type": "rebuild", 00:23:45.787 "target": "spare", 00:23:45.787 "progress": { 00:23:45.787 "blocks": 30720, 00:23:45.787 "percent": 46 00:23:45.787 } 00:23:45.787 }, 00:23:45.787 "base_bdevs_list": [ 00:23:45.787 { 00:23:45.787 "name": "spare", 00:23:45.787 "uuid": "ea844ba0-b363-5504-adfb-ad92a06d8329", 00:23:45.787 "is_configured": true, 00:23:45.787 "data_offset": 0, 00:23:45.787 "data_size": 65536 00:23:45.787 }, 00:23:45.787 { 00:23:45.787 "name": "BaseBdev2", 00:23:45.787 "uuid": "b456e1a3-1668-5989-bb82-b7b210fffcfb", 00:23:45.787 "is_configured": true, 00:23:45.787 "data_offset": 0, 00:23:45.787 "data_size": 65536 00:23:45.787 } 00:23:45.787 ] 00:23:45.787 }' 00:23:45.787 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:45.787 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:45.787 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:45.787 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:45.787 22:30:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:46.723 22:30:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:46.723 22:30:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:46.723 22:30:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:46.723 22:30:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:46.723 22:30:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:46.723 22:30:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:46.723 22:30:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.723 22:30:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:46.982 22:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:23:46.982 "name": "raid_bdev1", 00:23:46.982 "uuid": "a8d3d660-3656-415c-97a3-b33a50929304", 00:23:46.982 "strip_size_kb": 0, 00:23:46.982 "state": "online", 00:23:46.982 "raid_level": "raid1", 00:23:46.982 "superblock": false, 00:23:46.982 "num_base_bdevs": 2, 00:23:46.982 "num_base_bdevs_discovered": 2, 00:23:46.982 "num_base_bdevs_operational": 2, 00:23:46.982 "process": { 00:23:46.982 "type": "rebuild", 00:23:46.982 "target": "spare", 00:23:46.982 "progress": { 00:23:46.982 "blocks": 59392, 00:23:46.982 "percent": 90 00:23:46.982 } 00:23:46.982 }, 00:23:46.982 "base_bdevs_list": [ 00:23:46.982 { 00:23:46.982 "name": "spare", 00:23:46.982 "uuid": "ea844ba0-b363-5504-adfb-ad92a06d8329", 00:23:46.982 "is_configured": true, 00:23:46.982 "data_offset": 0, 00:23:46.982 "data_size": 65536 00:23:46.982 }, 00:23:46.982 { 00:23:46.982 "name": "BaseBdev2", 00:23:46.982 "uuid": "b456e1a3-1668-5989-bb82-b7b210fffcfb", 00:23:46.982 "is_configured": true, 00:23:46.982 "data_offset": 0, 00:23:46.982 "data_size": 65536 00:23:46.982 } 00:23:46.982 ] 00:23:46.982 }' 00:23:46.982 22:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:46.982 22:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:46.982 22:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:47.241 22:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:47.241 22:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:47.241 [2024-07-12 22:30:57.506864] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:47.241 [2024-07-12 22:30:57.506931] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:47.241 [2024-07-12 22:30:57.506968] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:48.177 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:48.177 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:48.177 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:48.177 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:48.177 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:48.177 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:48.177 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.177 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:48.436 "name": "raid_bdev1", 00:23:48.436 "uuid": "a8d3d660-3656-415c-97a3-b33a50929304", 00:23:48.436 "strip_size_kb": 0, 00:23:48.436 "state": "online", 00:23:48.436 "raid_level": "raid1", 00:23:48.436 "superblock": false, 00:23:48.436 "num_base_bdevs": 2, 00:23:48.436 "num_base_bdevs_discovered": 2, 00:23:48.436 "num_base_bdevs_operational": 2, 00:23:48.436 "base_bdevs_list": [ 00:23:48.436 { 00:23:48.436 "name": "spare", 00:23:48.436 "uuid": 
"ea844ba0-b363-5504-adfb-ad92a06d8329", 00:23:48.436 "is_configured": true, 00:23:48.436 "data_offset": 0, 00:23:48.436 "data_size": 65536 00:23:48.436 }, 00:23:48.436 { 00:23:48.436 "name": "BaseBdev2", 00:23:48.436 "uuid": "b456e1a3-1668-5989-bb82-b7b210fffcfb", 00:23:48.436 "is_configured": true, 00:23:48.436 "data_offset": 0, 00:23:48.436 "data_size": 65536 00:23:48.436 } 00:23:48.436 ] 00:23:48.436 }' 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.436 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.695 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:48.695 "name": "raid_bdev1", 00:23:48.695 "uuid": "a8d3d660-3656-415c-97a3-b33a50929304", 00:23:48.695 "strip_size_kb": 0, 00:23:48.695 "state": "online", 00:23:48.695 "raid_level": "raid1", 00:23:48.695 "superblock": false, 00:23:48.695 "num_base_bdevs": 2, 00:23:48.695 "num_base_bdevs_discovered": 2, 00:23:48.695 "num_base_bdevs_operational": 2, 00:23:48.695 "base_bdevs_list": [ 00:23:48.695 { 00:23:48.695 "name": "spare", 00:23:48.695 "uuid": "ea844ba0-b363-5504-adfb-ad92a06d8329", 00:23:48.695 "is_configured": true, 00:23:48.695 "data_offset": 0, 00:23:48.695 "data_size": 65536 00:23:48.695 }, 00:23:48.695 { 00:23:48.695 "name": "BaseBdev2", 00:23:48.695 "uuid": "b456e1a3-1668-5989-bb82-b7b210fffcfb", 00:23:48.695 "is_configured": true, 00:23:48.695 "data_offset": 0, 00:23:48.695 "data_size": 65536 00:23:48.695 } 00:23:48.695 ] 00:23:48.695 }' 00:23:48.695 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:48.695 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:48.695 22:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:48.695 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:48.695 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:48.695 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:48.695 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:48.695 
22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:48.695 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:48.695 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:48.695 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:48.695 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:48.695 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:48.695 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:48.695 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.696 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.960 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:48.960 "name": "raid_bdev1", 00:23:48.960 "uuid": "a8d3d660-3656-415c-97a3-b33a50929304", 00:23:48.960 "strip_size_kb": 0, 00:23:48.960 "state": "online", 00:23:48.960 "raid_level": "raid1", 00:23:48.960 "superblock": false, 00:23:48.960 "num_base_bdevs": 2, 00:23:48.960 "num_base_bdevs_discovered": 2, 00:23:48.960 "num_base_bdevs_operational": 2, 00:23:48.960 "base_bdevs_list": [ 00:23:48.960 { 00:23:48.960 "name": "spare", 00:23:48.960 "uuid": "ea844ba0-b363-5504-adfb-ad92a06d8329", 00:23:48.960 "is_configured": true, 00:23:48.960 "data_offset": 0, 00:23:48.960 "data_size": 65536 00:23:48.960 }, 00:23:48.960 { 00:23:48.960 "name": "BaseBdev2", 00:23:48.960 "uuid": "b456e1a3-1668-5989-bb82-b7b210fffcfb", 00:23:48.960 "is_configured": true, 00:23:48.960 "data_offset": 0, 00:23:48.960 "data_size": 65536 00:23:48.960 } 00:23:48.960 ] 00:23:48.960 }' 00:23:48.960 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:48.960 22:30:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:49.526 22:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:49.785 [2024-07-12 22:31:00.066198] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:49.785 [2024-07-12 22:31:00.066225] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:49.785 [2024-07-12 22:31:00.066287] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:49.785 [2024-07-12 22:31:00.066347] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:49.785 [2024-07-12 22:31:00.066360] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ffd070 name raid_bdev1, state offline 00:23:49.785 22:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.785 22:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:23:50.044 22:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:50.044 22:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:50.044 22:31:00 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:50.044 22:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:50.044 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:50.044 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:50.044 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:50.044 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:50.044 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:50.044 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:50.044 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:50.044 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:50.044 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:50.611 /dev/nbd0 00:23:50.611 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:50.611 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:50.611 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:50.611 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:50.611 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:50.611 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:50.611 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:50.611 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:50.611 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:50.611 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:50.612 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:50.612 1+0 records in 00:23:50.612 1+0 records out 00:23:50.612 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264319 s, 15.5 MB/s 00:23:50.612 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:50.612 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:50.612 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:50.612 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:50.612 22:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:50.612 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:50.612 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:50.612 22:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:50.871 /dev/nbd1 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:50.871 1+0 records in 00:23:50.871 1+0 records out 00:23:50.871 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000331889 s, 12.3 MB/s 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:50.871 22:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:51.130 22:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:51.130 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:51.130 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:51.130 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:51.130 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:51.130 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:51.130 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:51.388 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:51.388 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:51.389 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # 
local nbd_name=nbd0 00:23:51.389 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:51.389 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:51.389 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:51.389 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:51.389 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:51.389 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:51.389 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 3527713 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 3527713 ']' 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 3527713 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3527713 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3527713' 00:23:51.648 killing process with pid 3527713 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 3527713 00:23:51.648 Received shutdown signal, test time was about 60.000000 seconds 00:23:51.648 00:23:51.648 Latency(us) 00:23:51.648 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:51.648 =================================================================================================================== 00:23:51.648 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:51.648 [2024-07-12 22:31:01.859825] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:51.648 22:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 3527713 00:23:51.648 [2024-07-12 22:31:01.885735] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:51.907 22:31:02 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:23:51.907 00:23:51.907 real 0m23.915s 00:23:51.907 user 0m33.065s 00:23:51.907 sys 0m5.229s 00:23:51.907 22:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:51.907 22:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:51.907 ************************************ 00:23:51.907 END TEST raid_rebuild_test 00:23:51.907 ************************************ 00:23:51.908 22:31:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:51.908 22:31:02 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:23:51.908 22:31:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:51.908 22:31:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:51.908 22:31:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:51.908 ************************************ 00:23:51.908 START TEST raid_rebuild_test_sb 00:23:51.908 ************************************ 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:51.908 22:31:02 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=3530976 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 3530976 /var/tmp/spdk-raid.sock 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3530976 ']' 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:51.908 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:51.908 22:31:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:52.168 [2024-07-12 22:31:02.245952] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:23:52.168 [2024-07-12 22:31:02.246021] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3530976 ] 00:23:52.168 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:52.168 Zero copy mechanism will not be used. 
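The bdevperf invocation above (build/examples/bdevperf -r /var/tmp/spdk-raid.sock ... -z -L bdev_raid) starts the app in wait mode, and the lines that follow create the base bdevs and the array over its RPC socket. A condensed sketch of that setup, using only the calls, names and sizes that appear in this run; waitforlisten is the common test helper already invoked above, and the variable layout is illustrative:

  spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
  rpc="$spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # Start bdevperf in wait mode (-z) with raid debug logging, then wait for its RPC server
  $spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
  waitforlisten $! /var/tmp/spdk-raid.sock
  # Two 32 MB malloc bdevs with 512-byte blocks, each wrapped in a passthru bdev
  $rpc bdev_malloc_create 32 512 -b BaseBdev1_malloc
  $rpc bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
  $rpc bdev_malloc_create 32 512 -b BaseBdev2_malloc
  $rpc bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
  # The spare is backed by a delay bdev (-r/-t read, -w/-n write latencies as in this run) under a passthru
  $rpc bdev_malloc_create 32 512 -b spare_malloc
  $rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
  $rpc bdev_passthru_create -b spare_delay -p spare
  # Assemble the mirror with an on-disk superblock (-s), as the _sb variant requires
  $rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1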
00:23:52.168 [2024-07-12 22:31:02.375563] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:52.168 [2024-07-12 22:31:02.480043] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:52.428 [2024-07-12 22:31:02.554510] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:52.428 [2024-07-12 22:31:02.554549] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:53.023 22:31:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:53.023 22:31:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:23:53.023 22:31:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:53.023 22:31:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:53.281 BaseBdev1_malloc 00:23:53.282 22:31:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:53.282 [2024-07-12 22:31:03.586450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:53.282 [2024-07-12 22:31:03.586495] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:53.282 [2024-07-12 22:31:03.586519] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbe9d40 00:23:53.282 [2024-07-12 22:31:03.586531] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:53.282 [2024-07-12 22:31:03.588176] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:53.282 [2024-07-12 22:31:03.588205] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:53.282 BaseBdev1 00:23:53.540 22:31:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:53.540 22:31:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:53.540 BaseBdev2_malloc 00:23:53.799 22:31:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:53.799 [2024-07-12 22:31:04.104671] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:53.799 [2024-07-12 22:31:04.104715] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:53.799 [2024-07-12 22:31:04.104739] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbea860 00:23:53.799 [2024-07-12 22:31:04.104752] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:53.799 [2024-07-12 22:31:04.106206] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:53.799 [2024-07-12 22:31:04.106233] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:53.799 BaseBdev2 00:23:54.058 22:31:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:23:54.058 spare_malloc 00:23:54.317 22:31:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:54.317 spare_delay 00:23:54.576 22:31:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:54.576 [2024-07-12 22:31:04.871279] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:54.576 [2024-07-12 22:31:04.871324] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:54.576 [2024-07-12 22:31:04.871345] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd98ec0 00:23:54.576 [2024-07-12 22:31:04.871357] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:54.576 [2024-07-12 22:31:04.872951] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:54.576 [2024-07-12 22:31:04.872978] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:54.576 spare 00:23:54.576 22:31:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:54.836 [2024-07-12 22:31:05.115962] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:54.836 [2024-07-12 22:31:05.117304] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:54.836 [2024-07-12 22:31:05.117472] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd9a070 00:23:54.836 [2024-07-12 22:31:05.117485] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:54.836 [2024-07-12 22:31:05.117679] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd93490 00:23:54.836 [2024-07-12 22:31:05.117821] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd9a070 00:23:54.836 [2024-07-12 22:31:05.117832] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd9a070 00:23:54.836 [2024-07-12 22:31:05.117937] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:54.836 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:54.836 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:54.836 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:54.836 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:54.836 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:54.836 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:54.836 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:54.836 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:54.836 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:54.836 22:31:05 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:54.836 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.836 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.097 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:55.097 "name": "raid_bdev1", 00:23:55.097 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:23:55.097 "strip_size_kb": 0, 00:23:55.097 "state": "online", 00:23:55.097 "raid_level": "raid1", 00:23:55.097 "superblock": true, 00:23:55.097 "num_base_bdevs": 2, 00:23:55.097 "num_base_bdevs_discovered": 2, 00:23:55.097 "num_base_bdevs_operational": 2, 00:23:55.097 "base_bdevs_list": [ 00:23:55.097 { 00:23:55.097 "name": "BaseBdev1", 00:23:55.097 "uuid": "669d1f8f-c5e4-5b32-8ac3-49a4360a29d5", 00:23:55.097 "is_configured": true, 00:23:55.097 "data_offset": 2048, 00:23:55.097 "data_size": 63488 00:23:55.097 }, 00:23:55.097 { 00:23:55.097 "name": "BaseBdev2", 00:23:55.097 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:23:55.097 "is_configured": true, 00:23:55.097 "data_offset": 2048, 00:23:55.097 "data_size": 63488 00:23:55.097 } 00:23:55.097 ] 00:23:55.097 }' 00:23:55.097 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:55.097 22:31:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:56.035 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:56.035 22:31:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:56.035 [2024-07-12 22:31:06.134893] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:56.035 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:56.035 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.035 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:56.294 22:31:06 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:56.294 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:56.863 [2024-07-12 22:31:06.888690] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd93490 00:23:56.863 /dev/nbd0 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:56.863 1+0 records in 00:23:56.863 1+0 records out 00:23:56.863 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252574 s, 16.2 MB/s 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:56.863 22:31:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:24:02.138 63488+0 records in 00:24:02.138 63488+0 records out 00:24:02.138 32505856 bytes (33 MB, 31 MiB) copied, 4.9964 s, 6.5 MB/s 00:24:02.138 22:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:02.138 22:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:24:02.138 22:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:02.138 22:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:02.138 22:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:02.138 22:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:02.138 22:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:02.138 [2024-07-12 22:31:12.244444] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:02.138 [2024-07-12 22:31:12.408935] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:02.138 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:02.395 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:02.395 "name": "raid_bdev1", 00:24:02.395 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:02.395 "strip_size_kb": 0, 00:24:02.396 "state": "online", 
00:24:02.396 "raid_level": "raid1", 00:24:02.396 "superblock": true, 00:24:02.396 "num_base_bdevs": 2, 00:24:02.396 "num_base_bdevs_discovered": 1, 00:24:02.396 "num_base_bdevs_operational": 1, 00:24:02.396 "base_bdevs_list": [ 00:24:02.396 { 00:24:02.396 "name": null, 00:24:02.396 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.396 "is_configured": false, 00:24:02.396 "data_offset": 2048, 00:24:02.396 "data_size": 63488 00:24:02.396 }, 00:24:02.396 { 00:24:02.396 "name": "BaseBdev2", 00:24:02.396 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:02.396 "is_configured": true, 00:24:02.396 "data_offset": 2048, 00:24:02.396 "data_size": 63488 00:24:02.396 } 00:24:02.396 ] 00:24:02.396 }' 00:24:02.396 22:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:02.396 22:31:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:02.967 22:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:03.268 [2024-07-12 22:31:13.483776] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:03.268 [2024-07-12 22:31:13.488698] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd99ce0 00:24:03.268 [2024-07-12 22:31:13.490940] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:03.268 22:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:04.215 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:04.215 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:04.215 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:04.215 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:04.215 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:04.215 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.215 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.476 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:04.476 "name": "raid_bdev1", 00:24:04.476 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:04.476 "strip_size_kb": 0, 00:24:04.476 "state": "online", 00:24:04.476 "raid_level": "raid1", 00:24:04.476 "superblock": true, 00:24:04.476 "num_base_bdevs": 2, 00:24:04.476 "num_base_bdevs_discovered": 2, 00:24:04.476 "num_base_bdevs_operational": 2, 00:24:04.476 "process": { 00:24:04.476 "type": "rebuild", 00:24:04.476 "target": "spare", 00:24:04.476 "progress": { 00:24:04.476 "blocks": 22528, 00:24:04.476 "percent": 35 00:24:04.476 } 00:24:04.476 }, 00:24:04.476 "base_bdevs_list": [ 00:24:04.476 { 00:24:04.476 "name": "spare", 00:24:04.476 "uuid": "885e552d-4fb6-5e6c-bf5c-cd416eb156f4", 00:24:04.476 "is_configured": true, 00:24:04.476 "data_offset": 2048, 00:24:04.476 "data_size": 63488 00:24:04.476 }, 00:24:04.476 { 00:24:04.476 "name": "BaseBdev2", 00:24:04.476 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:04.476 "is_configured": true, 00:24:04.476 
"data_offset": 2048, 00:24:04.476 "data_size": 63488 00:24:04.476 } 00:24:04.476 ] 00:24:04.476 }' 00:24:04.476 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:04.476 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:04.476 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:04.476 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:04.476 22:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:04.735 [2024-07-12 22:31:15.013026] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:04.995 [2024-07-12 22:31:15.103204] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:04.995 [2024-07-12 22:31:15.103256] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:04.995 [2024-07-12 22:31:15.103271] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:04.995 [2024-07-12 22:31:15.103280] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:04.995 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:04.995 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:04.995 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:04.995 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:04.995 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:04.995 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:04.995 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:04.995 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:04.995 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:04.995 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:04.995 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.995 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.254 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:05.254 "name": "raid_bdev1", 00:24:05.254 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:05.254 "strip_size_kb": 0, 00:24:05.254 "state": "online", 00:24:05.254 "raid_level": "raid1", 00:24:05.254 "superblock": true, 00:24:05.254 "num_base_bdevs": 2, 00:24:05.254 "num_base_bdevs_discovered": 1, 00:24:05.254 "num_base_bdevs_operational": 1, 00:24:05.254 "base_bdevs_list": [ 00:24:05.254 { 00:24:05.254 "name": null, 00:24:05.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.254 "is_configured": false, 00:24:05.254 "data_offset": 2048, 00:24:05.254 "data_size": 63488 00:24:05.254 }, 00:24:05.254 { 
00:24:05.254 "name": "BaseBdev2", 00:24:05.254 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:05.254 "is_configured": true, 00:24:05.254 "data_offset": 2048, 00:24:05.254 "data_size": 63488 00:24:05.254 } 00:24:05.254 ] 00:24:05.254 }' 00:24:05.254 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:05.254 22:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:05.823 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:05.823 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:05.823 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:05.823 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:05.823 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:05.823 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.823 22:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.082 22:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:06.082 "name": "raid_bdev1", 00:24:06.082 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:06.082 "strip_size_kb": 0, 00:24:06.082 "state": "online", 00:24:06.082 "raid_level": "raid1", 00:24:06.082 "superblock": true, 00:24:06.082 "num_base_bdevs": 2, 00:24:06.082 "num_base_bdevs_discovered": 1, 00:24:06.082 "num_base_bdevs_operational": 1, 00:24:06.082 "base_bdevs_list": [ 00:24:06.082 { 00:24:06.082 "name": null, 00:24:06.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:06.082 "is_configured": false, 00:24:06.082 "data_offset": 2048, 00:24:06.082 "data_size": 63488 00:24:06.082 }, 00:24:06.082 { 00:24:06.082 "name": "BaseBdev2", 00:24:06.082 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:06.082 "is_configured": true, 00:24:06.082 "data_offset": 2048, 00:24:06.082 "data_size": 63488 00:24:06.082 } 00:24:06.082 ] 00:24:06.082 }' 00:24:06.082 22:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:06.082 22:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:06.082 22:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:06.082 22:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:06.082 22:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:06.342 [2024-07-12 22:31:16.564173] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:06.342 [2024-07-12 22:31:16.569549] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd99ce0 00:24:06.342 [2024-07-12 22:31:16.571048] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:06.342 22:31:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:07.279 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:24:07.279 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:07.279 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:07.279 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:07.279 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:07.279 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.279 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.539 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:07.539 "name": "raid_bdev1", 00:24:07.539 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:07.539 "strip_size_kb": 0, 00:24:07.539 "state": "online", 00:24:07.539 "raid_level": "raid1", 00:24:07.539 "superblock": true, 00:24:07.539 "num_base_bdevs": 2, 00:24:07.539 "num_base_bdevs_discovered": 2, 00:24:07.539 "num_base_bdevs_operational": 2, 00:24:07.539 "process": { 00:24:07.539 "type": "rebuild", 00:24:07.539 "target": "spare", 00:24:07.539 "progress": { 00:24:07.539 "blocks": 24576, 00:24:07.539 "percent": 38 00:24:07.539 } 00:24:07.539 }, 00:24:07.539 "base_bdevs_list": [ 00:24:07.539 { 00:24:07.539 "name": "spare", 00:24:07.539 "uuid": "885e552d-4fb6-5e6c-bf5c-cd416eb156f4", 00:24:07.539 "is_configured": true, 00:24:07.539 "data_offset": 2048, 00:24:07.539 "data_size": 63488 00:24:07.539 }, 00:24:07.539 { 00:24:07.539 "name": "BaseBdev2", 00:24:07.540 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:07.540 "is_configured": true, 00:24:07.540 "data_offset": 2048, 00:24:07.540 "data_size": 63488 00:24:07.540 } 00:24:07.540 ] 00:24:07.540 }' 00:24:07.540 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:07.799 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=781 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.799 22:31:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.058 22:31:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:08.059 "name": "raid_bdev1", 00:24:08.059 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:08.059 "strip_size_kb": 0, 00:24:08.059 "state": "online", 00:24:08.059 "raid_level": "raid1", 00:24:08.059 "superblock": true, 00:24:08.059 "num_base_bdevs": 2, 00:24:08.059 "num_base_bdevs_discovered": 2, 00:24:08.059 "num_base_bdevs_operational": 2, 00:24:08.059 "process": { 00:24:08.059 "type": "rebuild", 00:24:08.059 "target": "spare", 00:24:08.059 "progress": { 00:24:08.059 "blocks": 30720, 00:24:08.059 "percent": 48 00:24:08.059 } 00:24:08.059 }, 00:24:08.059 "base_bdevs_list": [ 00:24:08.059 { 00:24:08.059 "name": "spare", 00:24:08.059 "uuid": "885e552d-4fb6-5e6c-bf5c-cd416eb156f4", 00:24:08.059 "is_configured": true, 00:24:08.059 "data_offset": 2048, 00:24:08.059 "data_size": 63488 00:24:08.059 }, 00:24:08.059 { 00:24:08.059 "name": "BaseBdev2", 00:24:08.059 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:08.059 "is_configured": true, 00:24:08.059 "data_offset": 2048, 00:24:08.059 "data_size": 63488 00:24:08.059 } 00:24:08.059 ] 00:24:08.059 }' 00:24:08.059 22:31:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:08.059 22:31:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:08.059 22:31:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:08.059 22:31:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:08.059 22:31:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:08.998 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:08.998 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:08.998 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:08.998 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:08.998 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:08.998 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:08.998 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.998 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.258 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:09.258 "name": "raid_bdev1", 00:24:09.258 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:09.258 "strip_size_kb": 0, 00:24:09.258 "state": 
"online", 00:24:09.258 "raid_level": "raid1", 00:24:09.258 "superblock": true, 00:24:09.258 "num_base_bdevs": 2, 00:24:09.258 "num_base_bdevs_discovered": 2, 00:24:09.258 "num_base_bdevs_operational": 2, 00:24:09.258 "process": { 00:24:09.258 "type": "rebuild", 00:24:09.258 "target": "spare", 00:24:09.258 "progress": { 00:24:09.258 "blocks": 59392, 00:24:09.258 "percent": 93 00:24:09.258 } 00:24:09.258 }, 00:24:09.258 "base_bdevs_list": [ 00:24:09.258 { 00:24:09.258 "name": "spare", 00:24:09.258 "uuid": "885e552d-4fb6-5e6c-bf5c-cd416eb156f4", 00:24:09.258 "is_configured": true, 00:24:09.258 "data_offset": 2048, 00:24:09.258 "data_size": 63488 00:24:09.258 }, 00:24:09.258 { 00:24:09.258 "name": "BaseBdev2", 00:24:09.258 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:09.258 "is_configured": true, 00:24:09.258 "data_offset": 2048, 00:24:09.258 "data_size": 63488 00:24:09.258 } 00:24:09.258 ] 00:24:09.258 }' 00:24:09.258 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.258 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:09.258 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.517 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:09.517 22:31:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:09.517 [2024-07-12 22:31:19.696075] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:09.517 [2024-07-12 22:31:19.696135] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:09.517 [2024-07-12 22:31:19.696216] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:10.453 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:10.453 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:10.454 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:10.454 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:10.454 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:10.454 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:10.454 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.454 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:10.713 "name": "raid_bdev1", 00:24:10.713 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:10.713 "strip_size_kb": 0, 00:24:10.713 "state": "online", 00:24:10.713 "raid_level": "raid1", 00:24:10.713 "superblock": true, 00:24:10.713 "num_base_bdevs": 2, 00:24:10.713 "num_base_bdevs_discovered": 2, 00:24:10.713 "num_base_bdevs_operational": 2, 00:24:10.713 "base_bdevs_list": [ 00:24:10.713 { 00:24:10.713 "name": "spare", 00:24:10.713 "uuid": "885e552d-4fb6-5e6c-bf5c-cd416eb156f4", 00:24:10.713 "is_configured": true, 00:24:10.713 "data_offset": 2048, 00:24:10.713 "data_size": 63488 
00:24:10.713 }, 00:24:10.713 { 00:24:10.713 "name": "BaseBdev2", 00:24:10.713 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:10.713 "is_configured": true, 00:24:10.713 "data_offset": 2048, 00:24:10.713 "data_size": 63488 00:24:10.713 } 00:24:10.713 ] 00:24:10.713 }' 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.713 22:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:10.973 "name": "raid_bdev1", 00:24:10.973 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:10.973 "strip_size_kb": 0, 00:24:10.973 "state": "online", 00:24:10.973 "raid_level": "raid1", 00:24:10.973 "superblock": true, 00:24:10.973 "num_base_bdevs": 2, 00:24:10.973 "num_base_bdevs_discovered": 2, 00:24:10.973 "num_base_bdevs_operational": 2, 00:24:10.973 "base_bdevs_list": [ 00:24:10.973 { 00:24:10.973 "name": "spare", 00:24:10.973 "uuid": "885e552d-4fb6-5e6c-bf5c-cd416eb156f4", 00:24:10.973 "is_configured": true, 00:24:10.973 "data_offset": 2048, 00:24:10.973 "data_size": 63488 00:24:10.973 }, 00:24:10.973 { 00:24:10.973 "name": "BaseBdev2", 00:24:10.973 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:10.973 "is_configured": true, 00:24:10.973 "data_offset": 2048, 00:24:10.973 "data_size": 63488 00:24:10.973 } 00:24:10.973 ] 00:24:10.973 }' 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.973 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.232 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:11.232 "name": "raid_bdev1", 00:24:11.232 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:11.232 "strip_size_kb": 0, 00:24:11.232 "state": "online", 00:24:11.232 "raid_level": "raid1", 00:24:11.232 "superblock": true, 00:24:11.232 "num_base_bdevs": 2, 00:24:11.232 "num_base_bdevs_discovered": 2, 00:24:11.232 "num_base_bdevs_operational": 2, 00:24:11.232 "base_bdevs_list": [ 00:24:11.232 { 00:24:11.232 "name": "spare", 00:24:11.232 "uuid": "885e552d-4fb6-5e6c-bf5c-cd416eb156f4", 00:24:11.232 "is_configured": true, 00:24:11.232 "data_offset": 2048, 00:24:11.232 "data_size": 63488 00:24:11.232 }, 00:24:11.232 { 00:24:11.232 "name": "BaseBdev2", 00:24:11.232 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:11.232 "is_configured": true, 00:24:11.232 "data_offset": 2048, 00:24:11.232 "data_size": 63488 00:24:11.232 } 00:24:11.232 ] 00:24:11.232 }' 00:24:11.232 22:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:11.232 22:31:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:11.799 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:12.057 [2024-07-12 22:31:22.331726] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:12.057 [2024-07-12 22:31:22.331754] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:12.057 [2024-07-12 22:31:22.331816] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:12.057 [2024-07-12 22:31:22.331873] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:12.057 [2024-07-12 22:31:22.331885] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd9a070 name raid_bdev1, state offline 00:24:12.057 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.057 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:24:12.316 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:12.316 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:12.316 22:31:22 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:12.316 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:12.316 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:12.316 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:12.316 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:12.316 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:12.316 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:12.316 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:12.316 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:12.316 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:12.316 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:12.575 /dev/nbd0 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:12.575 1+0 records in 00:24:12.575 1+0 records out 00:24:12.575 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000159665 s, 25.7 MB/s 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:12.575 22:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:12.575 22:31:22 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:12.835 /dev/nbd1 00:24:12.835 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:12.835 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:12.835 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:12.835 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:12.835 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:12.835 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:12.835 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:13.094 1+0 records in 00:24:13.094 1+0 records out 00:24:13.094 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000358904 s, 11.4 MB/s 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:13.094 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:13.353 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:13.353 22:31:23 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:13.353 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:13.353 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:13.353 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:13.353 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:13.353 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:13.353 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:13.353 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:13.353 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:13.622 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:13.622 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:13.622 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:13.622 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:13.622 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:13.622 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:13.622 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:13.622 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:13.622 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:13.622 22:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:13.889 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:14.147 [2024-07-12 22:31:24.306061] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:14.147 [2024-07-12 22:31:24.306108] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:14.147 [2024-07-12 22:31:24.306133] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd96fe0 00:24:14.147 [2024-07-12 22:31:24.306146] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:14.147 [2024-07-12 22:31:24.307769] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:14.147 [2024-07-12 22:31:24.307799] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:14.147 [2024-07-12 22:31:24.307879] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:14.147 [2024-07-12 22:31:24.307905] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:14.147 [2024-07-12 22:31:24.308015] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:14.147 spare 00:24:14.147 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
00:24:14.147 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:14.147 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:14.147 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:14.147 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:14.147 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:14.147 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:14.147 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:14.147 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:14.147 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:14.147 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.147 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:14.147 [2024-07-12 22:31:24.408329] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd98260 00:24:14.147 [2024-07-12 22:31:24.408347] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:14.147 [2024-07-12 22:31:24.408546] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd92f50 00:24:14.148 [2024-07-12 22:31:24.408695] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd98260 00:24:14.148 [2024-07-12 22:31:24.408706] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd98260 00:24:14.148 [2024-07-12 22:31:24.408811] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:14.407 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:14.407 "name": "raid_bdev1", 00:24:14.407 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:14.407 "strip_size_kb": 0, 00:24:14.407 "state": "online", 00:24:14.407 "raid_level": "raid1", 00:24:14.407 "superblock": true, 00:24:14.407 "num_base_bdevs": 2, 00:24:14.407 "num_base_bdevs_discovered": 2, 00:24:14.407 "num_base_bdevs_operational": 2, 00:24:14.407 "base_bdevs_list": [ 00:24:14.407 { 00:24:14.407 "name": "spare", 00:24:14.407 "uuid": "885e552d-4fb6-5e6c-bf5c-cd416eb156f4", 00:24:14.407 "is_configured": true, 00:24:14.407 "data_offset": 2048, 00:24:14.407 "data_size": 63488 00:24:14.407 }, 00:24:14.407 { 00:24:14.407 "name": "BaseBdev2", 00:24:14.407 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:14.407 "is_configured": true, 00:24:14.407 "data_offset": 2048, 00:24:14.407 "data_size": 63488 00:24:14.407 } 00:24:14.407 ] 00:24:14.407 }' 00:24:14.407 22:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:14.407 22:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:14.975 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:14.975 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:14.975 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local 
process_type=none 00:24:14.975 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:14.975 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:14.975 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.975 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.235 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:15.235 "name": "raid_bdev1", 00:24:15.235 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:15.235 "strip_size_kb": 0, 00:24:15.235 "state": "online", 00:24:15.235 "raid_level": "raid1", 00:24:15.235 "superblock": true, 00:24:15.235 "num_base_bdevs": 2, 00:24:15.235 "num_base_bdevs_discovered": 2, 00:24:15.235 "num_base_bdevs_operational": 2, 00:24:15.235 "base_bdevs_list": [ 00:24:15.235 { 00:24:15.235 "name": "spare", 00:24:15.235 "uuid": "885e552d-4fb6-5e6c-bf5c-cd416eb156f4", 00:24:15.235 "is_configured": true, 00:24:15.235 "data_offset": 2048, 00:24:15.235 "data_size": 63488 00:24:15.235 }, 00:24:15.235 { 00:24:15.235 "name": "BaseBdev2", 00:24:15.235 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:15.235 "is_configured": true, 00:24:15.235 "data_offset": 2048, 00:24:15.235 "data_size": 63488 00:24:15.235 } 00:24:15.235 ] 00:24:15.235 }' 00:24:15.235 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:15.235 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:15.235 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:15.235 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:15.235 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.235 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:15.494 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:15.494 22:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:15.753 [2024-07-12 22:31:25.994676] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:15.753 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:15.753 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:15.753 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:15.753 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:15.753 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:15.753 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:15.753 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:15.753 22:31:26 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:15.753 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:15.753 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:15.753 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.753 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.012 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:16.012 "name": "raid_bdev1", 00:24:16.012 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:16.012 "strip_size_kb": 0, 00:24:16.012 "state": "online", 00:24:16.012 "raid_level": "raid1", 00:24:16.012 "superblock": true, 00:24:16.012 "num_base_bdevs": 2, 00:24:16.012 "num_base_bdevs_discovered": 1, 00:24:16.012 "num_base_bdevs_operational": 1, 00:24:16.012 "base_bdevs_list": [ 00:24:16.012 { 00:24:16.012 "name": null, 00:24:16.012 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:16.012 "is_configured": false, 00:24:16.012 "data_offset": 2048, 00:24:16.012 "data_size": 63488 00:24:16.012 }, 00:24:16.012 { 00:24:16.012 "name": "BaseBdev2", 00:24:16.012 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:16.012 "is_configured": true, 00:24:16.012 "data_offset": 2048, 00:24:16.012 "data_size": 63488 00:24:16.012 } 00:24:16.012 ] 00:24:16.012 }' 00:24:16.012 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:16.012 22:31:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:16.580 22:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:16.839 [2024-07-12 22:31:27.089607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:16.839 [2024-07-12 22:31:27.089767] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:16.840 [2024-07-12 22:31:27.089784] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:16.840 [2024-07-12 22:31:27.089812] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:16.840 [2024-07-12 22:31:27.095323] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd92f50 00:24:16.840 [2024-07-12 22:31:27.097729] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:16.840 22:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:18.239 "name": "raid_bdev1", 00:24:18.239 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:18.239 "strip_size_kb": 0, 00:24:18.239 "state": "online", 00:24:18.239 "raid_level": "raid1", 00:24:18.239 "superblock": true, 00:24:18.239 "num_base_bdevs": 2, 00:24:18.239 "num_base_bdevs_discovered": 2, 00:24:18.239 "num_base_bdevs_operational": 2, 00:24:18.239 "process": { 00:24:18.239 "type": "rebuild", 00:24:18.239 "target": "spare", 00:24:18.239 "progress": { 00:24:18.239 "blocks": 24576, 00:24:18.239 "percent": 38 00:24:18.239 } 00:24:18.239 }, 00:24:18.239 "base_bdevs_list": [ 00:24:18.239 { 00:24:18.239 "name": "spare", 00:24:18.239 "uuid": "885e552d-4fb6-5e6c-bf5c-cd416eb156f4", 00:24:18.239 "is_configured": true, 00:24:18.239 "data_offset": 2048, 00:24:18.239 "data_size": 63488 00:24:18.239 }, 00:24:18.239 { 00:24:18.239 "name": "BaseBdev2", 00:24:18.239 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:18.239 "is_configured": true, 00:24:18.239 "data_offset": 2048, 00:24:18.239 "data_size": 63488 00:24:18.239 } 00:24:18.239 ] 00:24:18.239 }' 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:18.239 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:18.499 [2024-07-12 22:31:28.684025] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:18.499 [2024-07-12 22:31:28.710310] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:18.499 [2024-07-12 22:31:28.710355] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:24:18.499 [2024-07-12 22:31:28.710370] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:18.499 [2024-07-12 22:31:28.710378] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:18.499 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:18.499 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:18.499 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:18.499 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:18.499 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:18.499 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:18.499 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:18.499 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:18.499 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:18.499 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:18.499 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.499 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:18.758 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:18.758 "name": "raid_bdev1", 00:24:18.758 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:18.758 "strip_size_kb": 0, 00:24:18.758 "state": "online", 00:24:18.758 "raid_level": "raid1", 00:24:18.758 "superblock": true, 00:24:18.758 "num_base_bdevs": 2, 00:24:18.758 "num_base_bdevs_discovered": 1, 00:24:18.758 "num_base_bdevs_operational": 1, 00:24:18.758 "base_bdevs_list": [ 00:24:18.758 { 00:24:18.758 "name": null, 00:24:18.758 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:18.758 "is_configured": false, 00:24:18.758 "data_offset": 2048, 00:24:18.758 "data_size": 63488 00:24:18.758 }, 00:24:18.758 { 00:24:18.758 "name": "BaseBdev2", 00:24:18.758 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:18.758 "is_configured": true, 00:24:18.758 "data_offset": 2048, 00:24:18.758 "data_size": 63488 00:24:18.758 } 00:24:18.758 ] 00:24:18.758 }' 00:24:18.758 22:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:18.758 22:31:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:19.324 22:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:19.582 [2024-07-12 22:31:29.809702] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:19.582 [2024-07-12 22:31:29.809751] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:19.582 [2024-07-12 22:31:29.809775] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd9a2f0 00:24:19.582 [2024-07-12 22:31:29.809787] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: 
bdev claimed 00:24:19.582 [2024-07-12 22:31:29.810162] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:19.582 [2024-07-12 22:31:29.810181] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:19.582 [2024-07-12 22:31:29.810260] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:19.582 [2024-07-12 22:31:29.810273] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:19.582 [2024-07-12 22:31:29.810291] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:19.582 [2024-07-12 22:31:29.810310] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:19.582 [2024-07-12 22:31:29.815170] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd92f50 00:24:19.582 spare 00:24:19.582 [2024-07-12 22:31:29.816637] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:19.582 22:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:20.519 22:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:20.519 22:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:20.519 22:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:20.519 22:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:20.519 22:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:20.519 22:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:20.519 22:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:20.833 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:20.833 "name": "raid_bdev1", 00:24:20.833 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:20.833 "strip_size_kb": 0, 00:24:20.833 "state": "online", 00:24:20.833 "raid_level": "raid1", 00:24:20.833 "superblock": true, 00:24:20.833 "num_base_bdevs": 2, 00:24:20.833 "num_base_bdevs_discovered": 2, 00:24:20.833 "num_base_bdevs_operational": 2, 00:24:20.833 "process": { 00:24:20.833 "type": "rebuild", 00:24:20.833 "target": "spare", 00:24:20.833 "progress": { 00:24:20.833 "blocks": 22528, 00:24:20.833 "percent": 35 00:24:20.833 } 00:24:20.833 }, 00:24:20.833 "base_bdevs_list": [ 00:24:20.833 { 00:24:20.833 "name": "spare", 00:24:20.833 "uuid": "885e552d-4fb6-5e6c-bf5c-cd416eb156f4", 00:24:20.833 "is_configured": true, 00:24:20.833 "data_offset": 2048, 00:24:20.833 "data_size": 63488 00:24:20.833 }, 00:24:20.833 { 00:24:20.833 "name": "BaseBdev2", 00:24:20.833 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:20.833 "is_configured": true, 00:24:20.833 "data_offset": 2048, 00:24:20.833 "data_size": 63488 00:24:20.833 } 00:24:20.833 ] 00:24:20.833 }' 00:24:20.833 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:20.833 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:20.833 22:31:31 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:20.833 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:20.833 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:21.100 [2024-07-12 22:31:31.335688] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:21.360 [2024-07-12 22:31:31.429430] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:21.360 [2024-07-12 22:31:31.429475] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:21.360 [2024-07-12 22:31:31.429490] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:21.360 [2024-07-12 22:31:31.429499] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:21.360 "name": "raid_bdev1", 00:24:21.360 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:21.360 "strip_size_kb": 0, 00:24:21.360 "state": "online", 00:24:21.360 "raid_level": "raid1", 00:24:21.360 "superblock": true, 00:24:21.360 "num_base_bdevs": 2, 00:24:21.360 "num_base_bdevs_discovered": 1, 00:24:21.360 "num_base_bdevs_operational": 1, 00:24:21.360 "base_bdevs_list": [ 00:24:21.360 { 00:24:21.360 "name": null, 00:24:21.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:21.360 "is_configured": false, 00:24:21.360 "data_offset": 2048, 00:24:21.360 "data_size": 63488 00:24:21.360 }, 00:24:21.360 { 00:24:21.360 "name": "BaseBdev2", 00:24:21.360 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:21.360 "is_configured": true, 00:24:21.360 "data_offset": 2048, 00:24:21.360 "data_size": 63488 00:24:21.360 } 00:24:21.360 ] 00:24:21.360 }' 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:21.360 22:31:31 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:24:21.929 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:21.929 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:21.929 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:21.929 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:21.929 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:21.929 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.929 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.188 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:22.188 "name": "raid_bdev1", 00:24:22.188 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:22.188 "strip_size_kb": 0, 00:24:22.188 "state": "online", 00:24:22.188 "raid_level": "raid1", 00:24:22.188 "superblock": true, 00:24:22.188 "num_base_bdevs": 2, 00:24:22.188 "num_base_bdevs_discovered": 1, 00:24:22.188 "num_base_bdevs_operational": 1, 00:24:22.188 "base_bdevs_list": [ 00:24:22.188 { 00:24:22.188 "name": null, 00:24:22.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:22.188 "is_configured": false, 00:24:22.188 "data_offset": 2048, 00:24:22.188 "data_size": 63488 00:24:22.188 }, 00:24:22.188 { 00:24:22.188 "name": "BaseBdev2", 00:24:22.188 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:22.188 "is_configured": true, 00:24:22.188 "data_offset": 2048, 00:24:22.188 "data_size": 63488 00:24:22.188 } 00:24:22.188 ] 00:24:22.188 }' 00:24:22.188 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:22.447 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:22.447 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:22.447 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:22.447 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:22.707 22:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:22.977 [2024-07-12 22:31:33.038772] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:22.977 [2024-07-12 22:31:33.038819] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:22.977 [2024-07-12 22:31:33.038839] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd96ad0 00:24:22.977 [2024-07-12 22:31:33.038870] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:22.977 [2024-07-12 22:31:33.039216] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:22.977 [2024-07-12 22:31:33.039235] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:22.977 [2024-07-12 22:31:33.039299] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:22.977 [2024-07-12 22:31:33.039311] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:22.977 [2024-07-12 22:31:33.039321] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:22.977 BaseBdev1 00:24:22.977 22:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:23.915 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:23.915 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:23.915 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:23.915 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:23.915 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:23.915 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:23.915 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:23.915 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:23.915 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:23.915 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:23.915 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.915 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.174 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:24.174 "name": "raid_bdev1", 00:24:24.174 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:24.174 "strip_size_kb": 0, 00:24:24.174 "state": "online", 00:24:24.174 "raid_level": "raid1", 00:24:24.174 "superblock": true, 00:24:24.174 "num_base_bdevs": 2, 00:24:24.174 "num_base_bdevs_discovered": 1, 00:24:24.174 "num_base_bdevs_operational": 1, 00:24:24.174 "base_bdevs_list": [ 00:24:24.174 { 00:24:24.174 "name": null, 00:24:24.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:24.174 "is_configured": false, 00:24:24.174 "data_offset": 2048, 00:24:24.174 "data_size": 63488 00:24:24.174 }, 00:24:24.174 { 00:24:24.174 "name": "BaseBdev2", 00:24:24.174 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:24.174 "is_configured": true, 00:24:24.174 "data_offset": 2048, 00:24:24.174 "data_size": 63488 00:24:24.174 } 00:24:24.174 ] 00:24:24.174 }' 00:24:24.174 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:24.174 22:31:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:24.743 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:24.743 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:24.743 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:24.743 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # 
local target=none 00:24:24.743 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:24.743 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.743 22:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.002 22:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:25.002 "name": "raid_bdev1", 00:24:25.002 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:25.002 "strip_size_kb": 0, 00:24:25.002 "state": "online", 00:24:25.002 "raid_level": "raid1", 00:24:25.002 "superblock": true, 00:24:25.002 "num_base_bdevs": 2, 00:24:25.002 "num_base_bdevs_discovered": 1, 00:24:25.002 "num_base_bdevs_operational": 1, 00:24:25.002 "base_bdevs_list": [ 00:24:25.002 { 00:24:25.002 "name": null, 00:24:25.002 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.002 "is_configured": false, 00:24:25.002 "data_offset": 2048, 00:24:25.002 "data_size": 63488 00:24:25.002 }, 00:24:25.002 { 00:24:25.002 "name": "BaseBdev2", 00:24:25.002 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:25.002 "is_configured": true, 00:24:25.002 "data_offset": 2048, 00:24:25.002 "data_size": 63488 00:24:25.002 } 00:24:25.002 ] 00:24:25.002 }' 00:24:25.002 22:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:25.002 22:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:25.002 22:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:25.002 22:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:25.002 22:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:25.002 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:24:25.002 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:25.002 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:25.002 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:25.002 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:25.003 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:25.003 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:25.003 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:25.003 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:25.003 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:25.003 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:25.262 [2024-07-12 22:31:35.481293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:25.262 [2024-07-12 22:31:35.481425] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:25.262 [2024-07-12 22:31:35.481442] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:25.262 request: 00:24:25.262 { 00:24:25.262 "base_bdev": "BaseBdev1", 00:24:25.262 "raid_bdev": "raid_bdev1", 00:24:25.262 "method": "bdev_raid_add_base_bdev", 00:24:25.262 "req_id": 1 00:24:25.262 } 00:24:25.262 Got JSON-RPC error response 00:24:25.262 response: 00:24:25.262 { 00:24:25.262 "code": -22, 00:24:25.262 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:25.262 } 00:24:25.262 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:24:25.262 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:25.262 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:25.262 22:31:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:25.262 22:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:26.200 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:26.200 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:26.200 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:26.200 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:26.200 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:26.200 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:26.200 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:26.200 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:26.200 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:26.200 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:26.200 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.200 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.459 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:26.459 "name": "raid_bdev1", 00:24:26.459 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:26.459 "strip_size_kb": 0, 00:24:26.459 "state": "online", 00:24:26.459 "raid_level": "raid1", 00:24:26.459 "superblock": true, 00:24:26.459 "num_base_bdevs": 2, 00:24:26.459 "num_base_bdevs_discovered": 1, 00:24:26.459 "num_base_bdevs_operational": 1, 00:24:26.459 
"base_bdevs_list": [ 00:24:26.459 { 00:24:26.459 "name": null, 00:24:26.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:26.459 "is_configured": false, 00:24:26.459 "data_offset": 2048, 00:24:26.459 "data_size": 63488 00:24:26.459 }, 00:24:26.459 { 00:24:26.459 "name": "BaseBdev2", 00:24:26.459 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:26.459 "is_configured": true, 00:24:26.459 "data_offset": 2048, 00:24:26.459 "data_size": 63488 00:24:26.459 } 00:24:26.459 ] 00:24:26.459 }' 00:24:26.459 22:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:26.459 22:31:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:27.028 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:27.028 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:27.028 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:27.028 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:27.028 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:27.028 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.028 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.288 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:27.288 "name": "raid_bdev1", 00:24:27.288 "uuid": "eee2aa05-279d-422c-845c-9aef57044df3", 00:24:27.288 "strip_size_kb": 0, 00:24:27.288 "state": "online", 00:24:27.288 "raid_level": "raid1", 00:24:27.288 "superblock": true, 00:24:27.288 "num_base_bdevs": 2, 00:24:27.288 "num_base_bdevs_discovered": 1, 00:24:27.288 "num_base_bdevs_operational": 1, 00:24:27.288 "base_bdevs_list": [ 00:24:27.288 { 00:24:27.288 "name": null, 00:24:27.288 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:27.288 "is_configured": false, 00:24:27.288 "data_offset": 2048, 00:24:27.288 "data_size": 63488 00:24:27.288 }, 00:24:27.288 { 00:24:27.288 "name": "BaseBdev2", 00:24:27.288 "uuid": "00a8a65b-b3a4-5ae2-a5e4-7f21a9afb449", 00:24:27.288 "is_configured": true, 00:24:27.288 "data_offset": 2048, 00:24:27.288 "data_size": 63488 00:24:27.288 } 00:24:27.288 ] 00:24:27.288 }' 00:24:27.288 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:27.288 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:27.288 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:27.548 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:27.548 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 3530976 00:24:27.548 22:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3530976 ']' 00:24:27.548 22:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 3530976 00:24:27.548 22:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:24:27.548 22:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:27.548 22:31:37 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3530976 00:24:27.548 22:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:27.548 22:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:27.548 22:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3530976' 00:24:27.548 killing process with pid 3530976 00:24:27.548 22:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 3530976 00:24:27.548 Received shutdown signal, test time was about 60.000000 seconds 00:24:27.548 00:24:27.548 Latency(us) 00:24:27.548 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:27.548 =================================================================================================================== 00:24:27.548 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:27.548 [2024-07-12 22:31:37.669429] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:27.548 [2024-07-12 22:31:37.669526] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:27.548 [2024-07-12 22:31:37.669571] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:27.548 [2024-07-12 22:31:37.669585] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd98260 name raid_bdev1, state offline 00:24:27.548 22:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 3530976 00:24:27.548 [2024-07-12 22:31:37.700535] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:27.808 22:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:24:27.808 00:24:27.808 real 0m35.751s 00:24:27.808 user 0m51.900s 00:24:27.808 sys 0m6.656s 00:24:27.808 22:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:27.808 22:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:27.808 ************************************ 00:24:27.808 END TEST raid_rebuild_test_sb 00:24:27.808 ************************************ 00:24:27.808 22:31:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:27.808 22:31:37 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:24:27.808 22:31:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:27.808 22:31:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:27.808 22:31:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:27.808 ************************************ 00:24:27.808 START TEST raid_rebuild_test_io 00:24:27.808 ************************************ 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 
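The state checks that recur throughout this run all follow the pattern the xtrace lines above show: query every RAID bdev over the test's dedicated RPC socket and filter for the bdev under test with jq. A minimal sketch of that pattern follows; the socket path, RPC method, and bdev name are taken from this log, while the simplified assertions stand in for the fuller verify_raid_bdev_state helper in bdev_raid.sh and are not a substitute for it.

    #!/usr/bin/env bash
    # Sketch (assumptions noted above): query RAID bdev state the way this test does.
    rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk   # path as printed in this run
    rpc_sock=/var/tmp/spdk-raid.sock

    # Ask the target for all RAID bdevs and keep only raid_bdev1.
    raid_bdev_info=$("$rootdir/scripts/rpc.py" -s "$rpc_sock" bdev_raid_get_bdevs all |
        jq -r '.[] | select(.name == "raid_bdev1")')

    # Simplified checks on the fields the log repeatedly inspects.
    state=$(jq -r '.state' <<<"$raid_bdev_info")
    discovered=$(jq -r '.num_base_bdevs_discovered' <<<"$raid_bdev_info")
    [[ $state == online ]] || echo "unexpected state: $state"
    echo "base bdevs discovered: $discovered"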
00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=3536066 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 3536066 /var/tmp/spdk-raid.sock 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 3536066 ']' 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:27.808 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:27.808 22:31:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:27.808 [2024-07-12 22:31:38.090973] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
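The background I/O for this test comes from bdevperf attached to the same RPC socket; the command line and flags are the ones printed above (60 s randrw, 50% read mix, 3M I/O size, queue depth 2, wait mode). A condensed sketch of the launch-and-trigger sequence, assuming the paths shown in this log and omitting the waitforlisten and bdev setup RPCs that sit between the two steps in the real run:

    rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
    rpc_sock=/var/tmp/spdk-raid.sock

    # Start bdevperf in wait mode (-z) with the workload used in this run.
    "$rootdir/build/examples/bdevperf" -r "$rpc_sock" -T raid_bdev1 -t 60 \
        -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!

    # After the raid bdev is assembled over RPC, kick off the I/O run,
    # as the log shows later via bdevperf.py perform_tests.
    "$rootdir/examples/bdev/bdevperf/bdevperf.py" -s "$rpc_sock" perform_tests &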
00:24:27.808 [2024-07-12 22:31:38.091048] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3536066 ] 00:24:27.808 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:27.808 Zero copy mechanism will not be used. 00:24:28.068 [2024-07-12 22:31:38.223260] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:28.068 [2024-07-12 22:31:38.328796] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:28.068 [2024-07-12 22:31:38.388157] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:28.068 [2024-07-12 22:31:38.388186] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:29.028 22:31:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:29.028 22:31:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:24:29.028 22:31:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:29.028 22:31:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:29.028 BaseBdev1_malloc 00:24:29.028 22:31:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:29.287 [2024-07-12 22:31:39.504953] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:29.287 [2024-07-12 22:31:39.505001] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:29.287 [2024-07-12 22:31:39.505024] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe76d40 00:24:29.287 [2024-07-12 22:31:39.505042] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:29.287 [2024-07-12 22:31:39.506662] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:29.287 [2024-07-12 22:31:39.506690] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:29.287 BaseBdev1 00:24:29.287 22:31:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:29.287 22:31:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:29.546 BaseBdev2_malloc 00:24:29.546 22:31:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:29.805 [2024-07-12 22:31:39.938962] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:29.805 [2024-07-12 22:31:39.939006] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:29.805 [2024-07-12 22:31:39.939029] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe77860 00:24:29.805 [2024-07-12 22:31:39.939041] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:29.805 [2024-07-12 22:31:39.940418] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:29.805 [2024-07-12 22:31:39.940445] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:29.805 BaseBdev2 00:24:29.805 22:31:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:30.064 spare_malloc 00:24:30.064 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:30.322 spare_delay 00:24:30.322 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:30.581 [2024-07-12 22:31:40.681687] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:30.581 [2024-07-12 22:31:40.681732] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:30.581 [2024-07-12 22:31:40.681752] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1025ec0 00:24:30.581 [2024-07-12 22:31:40.681764] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:30.581 [2024-07-12 22:31:40.683191] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:30.581 [2024-07-12 22:31:40.683217] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:30.581 spare 00:24:30.581 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:30.581 [2024-07-12 22:31:40.858183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:30.581 [2024-07-12 22:31:40.859437] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:30.581 [2024-07-12 22:31:40.859511] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1027070 00:24:30.581 [2024-07-12 22:31:40.859521] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:30.581 [2024-07-12 22:31:40.859734] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1020490 00:24:30.581 [2024-07-12 22:31:40.859874] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1027070 00:24:30.581 [2024-07-12 22:31:40.859885] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1027070 00:24:30.581 [2024-07-12 22:31:40.860008] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:30.581 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:30.581 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:30.581 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:30.581 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:30.581 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:30.581 22:31:40 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:30.581 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:30.581 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:30.581 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:30.581 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:30.581 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:30.581 22:31:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.840 22:31:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:30.840 "name": "raid_bdev1", 00:24:30.840 "uuid": "adbfd812-9850-411c-98dc-f6498615d622", 00:24:30.840 "strip_size_kb": 0, 00:24:30.840 "state": "online", 00:24:30.840 "raid_level": "raid1", 00:24:30.840 "superblock": false, 00:24:30.840 "num_base_bdevs": 2, 00:24:30.840 "num_base_bdevs_discovered": 2, 00:24:30.840 "num_base_bdevs_operational": 2, 00:24:30.840 "base_bdevs_list": [ 00:24:30.840 { 00:24:30.840 "name": "BaseBdev1", 00:24:30.840 "uuid": "14e801cc-cf3d-548e-969d-91627ce2590c", 00:24:30.840 "is_configured": true, 00:24:30.840 "data_offset": 0, 00:24:30.840 "data_size": 65536 00:24:30.840 }, 00:24:30.840 { 00:24:30.840 "name": "BaseBdev2", 00:24:30.840 "uuid": "dc185903-0400-5d1f-9869-779cac8c16a7", 00:24:30.840 "is_configured": true, 00:24:30.840 "data_offset": 0, 00:24:30.840 "data_size": 65536 00:24:30.840 } 00:24:30.840 ] 00:24:30.840 }' 00:24:30.840 22:31:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:30.840 22:31:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:31.407 22:31:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:31.407 22:31:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:31.665 [2024-07-12 22:31:41.925255] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:31.665 22:31:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:31.665 22:31:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.665 22:31:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:31.923 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:31.923 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:31.923 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:31.924 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:32.182 [2024-07-12 22:31:42.324201] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x1021bd0 00:24:32.182 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:32.182 Zero copy mechanism will not be used. 00:24:32.182 Running I/O for 60 seconds... 00:24:32.182 [2024-07-12 22:31:42.441442] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:32.182 [2024-07-12 22:31:42.449658] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1021bd0 00:24:32.182 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:32.182 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:32.182 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:32.182 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:32.182 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:32.182 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:32.182 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:32.182 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:32.182 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:32.182 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:32.182 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.182 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:32.441 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:32.441 "name": "raid_bdev1", 00:24:32.441 "uuid": "adbfd812-9850-411c-98dc-f6498615d622", 00:24:32.441 "strip_size_kb": 0, 00:24:32.441 "state": "online", 00:24:32.441 "raid_level": "raid1", 00:24:32.441 "superblock": false, 00:24:32.441 "num_base_bdevs": 2, 00:24:32.441 "num_base_bdevs_discovered": 1, 00:24:32.441 "num_base_bdevs_operational": 1, 00:24:32.441 "base_bdevs_list": [ 00:24:32.441 { 00:24:32.441 "name": null, 00:24:32.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:32.441 "is_configured": false, 00:24:32.441 "data_offset": 0, 00:24:32.441 "data_size": 65536 00:24:32.441 }, 00:24:32.441 { 00:24:32.441 "name": "BaseBdev2", 00:24:32.441 "uuid": "dc185903-0400-5d1f-9869-779cac8c16a7", 00:24:32.441 "is_configured": true, 00:24:32.441 "data_offset": 0, 00:24:32.441 "data_size": 65536 00:24:32.441 } 00:24:32.441 ] 00:24:32.441 }' 00:24:32.441 22:31:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:32.441 22:31:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:33.025 22:31:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:33.283 [2024-07-12 22:31:43.508941] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:33.283 [2024-07-12 22:31:43.568201] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfa98b0 00:24:33.283 [2024-07-12 22:31:43.570560] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:33.283 22:31:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:33.542 [2024-07-12 22:31:43.697231] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:33.542 [2024-07-12 22:31:43.697669] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:34.110 [2024-07-12 22:31:44.342520] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:34.368 [2024-07-12 22:31:44.563309] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:34.368 [2024-07-12 22:31:44.563632] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:34.368 22:31:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:34.368 22:31:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:34.368 22:31:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:34.368 22:31:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:34.368 22:31:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:34.368 22:31:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.368 22:31:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.627 [2024-07-12 22:31:44.793059] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:34.627 [2024-07-12 22:31:44.793279] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:34.627 22:31:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:34.627 "name": "raid_bdev1", 00:24:34.627 "uuid": "adbfd812-9850-411c-98dc-f6498615d622", 00:24:34.627 "strip_size_kb": 0, 00:24:34.627 "state": "online", 00:24:34.627 "raid_level": "raid1", 00:24:34.627 "superblock": false, 00:24:34.627 "num_base_bdevs": 2, 00:24:34.627 "num_base_bdevs_discovered": 2, 00:24:34.627 "num_base_bdevs_operational": 2, 00:24:34.627 "process": { 00:24:34.627 "type": "rebuild", 00:24:34.627 "target": "spare", 00:24:34.627 "progress": { 00:24:34.627 "blocks": 16384, 00:24:34.627 "percent": 25 00:24:34.627 } 00:24:34.627 }, 00:24:34.627 "base_bdevs_list": [ 00:24:34.627 { 00:24:34.627 "name": "spare", 00:24:34.627 "uuid": "bacf46d1-5adf-5d0a-8d71-30ed888a40e8", 00:24:34.627 "is_configured": true, 00:24:34.627 "data_offset": 0, 00:24:34.627 "data_size": 65536 00:24:34.627 }, 00:24:34.627 { 00:24:34.627 "name": "BaseBdev2", 00:24:34.627 "uuid": "dc185903-0400-5d1f-9869-779cac8c16a7", 00:24:34.627 "is_configured": true, 00:24:34.627 "data_offset": 0, 00:24:34.627 "data_size": 65536 00:24:34.627 } 00:24:34.627 ] 00:24:34.627 }' 00:24:34.627 22:31:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:34.627 22:31:44 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:34.627 22:31:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:34.627 22:31:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:34.627 22:31:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:34.886 [2024-07-12 22:31:45.145520] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:35.145 [2024-07-12 22:31:45.343118] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:35.145 [2024-07-12 22:31:45.344987] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:35.145 [2024-07-12 22:31:45.345016] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:35.145 [2024-07-12 22:31:45.345027] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:35.145 [2024-07-12 22:31:45.358700] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1021bd0 00:24:35.145 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:35.145 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:35.145 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:35.145 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:35.145 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:35.145 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:35.145 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:35.145 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:35.145 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:35.145 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:35.145 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.145 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.405 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:35.405 "name": "raid_bdev1", 00:24:35.405 "uuid": "adbfd812-9850-411c-98dc-f6498615d622", 00:24:35.405 "strip_size_kb": 0, 00:24:35.405 "state": "online", 00:24:35.405 "raid_level": "raid1", 00:24:35.405 "superblock": false, 00:24:35.405 "num_base_bdevs": 2, 00:24:35.405 "num_base_bdevs_discovered": 1, 00:24:35.405 "num_base_bdevs_operational": 1, 00:24:35.405 "base_bdevs_list": [ 00:24:35.405 { 00:24:35.405 "name": null, 00:24:35.405 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:35.405 "is_configured": false, 00:24:35.405 "data_offset": 0, 00:24:35.405 "data_size": 65536 00:24:35.405 }, 00:24:35.405 { 00:24:35.405 "name": "BaseBdev2", 00:24:35.405 "uuid": "dc185903-0400-5d1f-9869-779cac8c16a7", 00:24:35.405 "is_configured": true, 00:24:35.405 
"data_offset": 0, 00:24:35.405 "data_size": 65536 00:24:35.405 } 00:24:35.405 ] 00:24:35.405 }' 00:24:35.405 22:31:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:35.405 22:31:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:35.973 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:35.973 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:35.973 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:35.973 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:35.973 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:35.973 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.973 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.233 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:36.233 "name": "raid_bdev1", 00:24:36.233 "uuid": "adbfd812-9850-411c-98dc-f6498615d622", 00:24:36.233 "strip_size_kb": 0, 00:24:36.233 "state": "online", 00:24:36.233 "raid_level": "raid1", 00:24:36.233 "superblock": false, 00:24:36.233 "num_base_bdevs": 2, 00:24:36.233 "num_base_bdevs_discovered": 1, 00:24:36.233 "num_base_bdevs_operational": 1, 00:24:36.233 "base_bdevs_list": [ 00:24:36.233 { 00:24:36.233 "name": null, 00:24:36.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.233 "is_configured": false, 00:24:36.233 "data_offset": 0, 00:24:36.233 "data_size": 65536 00:24:36.233 }, 00:24:36.233 { 00:24:36.233 "name": "BaseBdev2", 00:24:36.233 "uuid": "dc185903-0400-5d1f-9869-779cac8c16a7", 00:24:36.233 "is_configured": true, 00:24:36.233 "data_offset": 0, 00:24:36.233 "data_size": 65536 00:24:36.233 } 00:24:36.233 ] 00:24:36.233 }' 00:24:36.493 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:36.493 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:36.493 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:36.493 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:36.493 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:36.752 [2024-07-12 22:31:46.862536] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:36.752 22:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:36.752 [2024-07-12 22:31:46.937516] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1027450 00:24:36.752 [2024-07-12 22:31:46.939008] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:36.752 [2024-07-12 22:31:47.065472] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:36.752 [2024-07-12 22:31:47.065948] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 
offset_begin: 0 offset_end: 6144 00:24:37.010 [2024-07-12 22:31:47.192915] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:37.011 [2024-07-12 22:31:47.193066] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:37.270 [2024-07-12 22:31:47.566996] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:37.530 [2024-07-12 22:31:47.805059] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:37.530 [2024-07-12 22:31:47.805339] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:37.829 [2024-07-12 22:31:47.917090] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:37.829 [2024-07-12 22:31:47.917230] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:37.829 22:31:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:37.829 22:31:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:37.829 22:31:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:37.829 22:31:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:37.829 22:31:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:37.829 22:31:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.829 22:31:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:37.829 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:37.829 "name": "raid_bdev1", 00:24:37.829 "uuid": "adbfd812-9850-411c-98dc-f6498615d622", 00:24:37.829 "strip_size_kb": 0, 00:24:37.829 "state": "online", 00:24:37.829 "raid_level": "raid1", 00:24:37.829 "superblock": false, 00:24:37.829 "num_base_bdevs": 2, 00:24:37.829 "num_base_bdevs_discovered": 2, 00:24:37.829 "num_base_bdevs_operational": 2, 00:24:37.829 "process": { 00:24:37.829 "type": "rebuild", 00:24:37.829 "target": "spare", 00:24:37.829 "progress": { 00:24:37.829 "blocks": 18432, 00:24:37.829 "percent": 28 00:24:37.829 } 00:24:37.829 }, 00:24:37.829 "base_bdevs_list": [ 00:24:37.829 { 00:24:37.829 "name": "spare", 00:24:37.829 "uuid": "bacf46d1-5adf-5d0a-8d71-30ed888a40e8", 00:24:37.829 "is_configured": true, 00:24:37.829 "data_offset": 0, 00:24:37.829 "data_size": 65536 00:24:37.829 }, 00:24:37.829 { 00:24:37.829 "name": "BaseBdev2", 00:24:37.829 "uuid": "dc185903-0400-5d1f-9869-779cac8c16a7", 00:24:37.829 "is_configured": true, 00:24:37.829 "data_offset": 0, 00:24:37.829 "data_size": 65536 00:24:37.829 } 00:24:37.829 ] 00:24:37.829 }' 00:24:37.829 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=812 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.090 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.090 [2024-07-12 22:31:48.309771] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:38.349 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:38.349 "name": "raid_bdev1", 00:24:38.349 "uuid": "adbfd812-9850-411c-98dc-f6498615d622", 00:24:38.349 "strip_size_kb": 0, 00:24:38.349 "state": "online", 00:24:38.349 "raid_level": "raid1", 00:24:38.349 "superblock": false, 00:24:38.349 "num_base_bdevs": 2, 00:24:38.349 "num_base_bdevs_discovered": 2, 00:24:38.349 "num_base_bdevs_operational": 2, 00:24:38.349 "process": { 00:24:38.349 "type": "rebuild", 00:24:38.349 "target": "spare", 00:24:38.349 "progress": { 00:24:38.349 "blocks": 24576, 00:24:38.349 "percent": 37 00:24:38.349 } 00:24:38.349 }, 00:24:38.349 "base_bdevs_list": [ 00:24:38.349 { 00:24:38.349 "name": "spare", 00:24:38.349 "uuid": "bacf46d1-5adf-5d0a-8d71-30ed888a40e8", 00:24:38.349 "is_configured": true, 00:24:38.349 "data_offset": 0, 00:24:38.349 "data_size": 65536 00:24:38.349 }, 00:24:38.349 { 00:24:38.349 "name": "BaseBdev2", 00:24:38.349 "uuid": "dc185903-0400-5d1f-9869-779cac8c16a7", 00:24:38.349 "is_configured": true, 00:24:38.349 "data_offset": 0, 00:24:38.349 "data_size": 65536 00:24:38.349 } 00:24:38.349 ] 00:24:38.349 }' 00:24:38.349 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.349 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:38.349 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:38.349 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:38.349 22:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:38.608 
[2024-07-12 22:31:48.869074] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:24:39.173 [2024-07-12 22:31:49.216455] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:24:39.431 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:39.431 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:39.431 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:39.431 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:39.431 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:39.431 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:39.432 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.432 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.432 [2024-07-12 22:31:49.621015] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:24:39.432 [2024-07-12 22:31:49.730507] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:24:39.690 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:39.690 "name": "raid_bdev1", 00:24:39.690 "uuid": "adbfd812-9850-411c-98dc-f6498615d622", 00:24:39.690 "strip_size_kb": 0, 00:24:39.690 "state": "online", 00:24:39.690 "raid_level": "raid1", 00:24:39.690 "superblock": false, 00:24:39.690 "num_base_bdevs": 2, 00:24:39.690 "num_base_bdevs_discovered": 2, 00:24:39.690 "num_base_bdevs_operational": 2, 00:24:39.690 "process": { 00:24:39.690 "type": "rebuild", 00:24:39.690 "target": "spare", 00:24:39.690 "progress": { 00:24:39.690 "blocks": 47104, 00:24:39.690 "percent": 71 00:24:39.690 } 00:24:39.690 }, 00:24:39.690 "base_bdevs_list": [ 00:24:39.690 { 00:24:39.690 "name": "spare", 00:24:39.690 "uuid": "bacf46d1-5adf-5d0a-8d71-30ed888a40e8", 00:24:39.690 "is_configured": true, 00:24:39.690 "data_offset": 0, 00:24:39.690 "data_size": 65536 00:24:39.690 }, 00:24:39.690 { 00:24:39.690 "name": "BaseBdev2", 00:24:39.690 "uuid": "dc185903-0400-5d1f-9869-779cac8c16a7", 00:24:39.690 "is_configured": true, 00:24:39.690 "data_offset": 0, 00:24:39.690 "data_size": 65536 00:24:39.690 } 00:24:39.690 ] 00:24:39.690 }' 00:24:39.690 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:39.690 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:39.690 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:39.690 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:39.690 22:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:39.950 [2024-07-12 22:31:50.069951] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:24:40.209 [2024-07-12 
22:31:50.281018] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:24:40.209 [2024-07-12 22:31:50.281199] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:24:40.209 [2024-07-12 22:31:50.529071] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:24:40.776 22:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:40.776 22:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:40.776 22:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:40.776 22:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:40.776 22:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:40.776 22:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:40.776 22:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.776 22:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:40.776 [2024-07-12 22:31:50.978285] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:40.776 [2024-07-12 22:31:51.086546] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:40.776 [2024-07-12 22:31:51.088683] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.034 "name": "raid_bdev1", 00:24:41.034 "uuid": "adbfd812-9850-411c-98dc-f6498615d622", 00:24:41.034 "strip_size_kb": 0, 00:24:41.034 "state": "online", 00:24:41.034 "raid_level": "raid1", 00:24:41.034 "superblock": false, 00:24:41.034 "num_base_bdevs": 2, 00:24:41.034 "num_base_bdevs_discovered": 2, 00:24:41.034 "num_base_bdevs_operational": 2, 00:24:41.034 "base_bdevs_list": [ 00:24:41.034 { 00:24:41.034 "name": "spare", 00:24:41.034 "uuid": "bacf46d1-5adf-5d0a-8d71-30ed888a40e8", 00:24:41.034 "is_configured": true, 00:24:41.034 "data_offset": 0, 00:24:41.034 "data_size": 65536 00:24:41.034 }, 00:24:41.034 { 00:24:41.034 "name": "BaseBdev2", 00:24:41.034 "uuid": "dc185903-0400-5d1f-9869-779cac8c16a7", 00:24:41.034 "is_configured": true, 00:24:41.034 "data_offset": 0, 00:24:41.034 "data_size": 65536 00:24:41.034 } 00:24:41.034 ] 00:24:41.034 }' 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.034 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.293 "name": "raid_bdev1", 00:24:41.293 "uuid": "adbfd812-9850-411c-98dc-f6498615d622", 00:24:41.293 "strip_size_kb": 0, 00:24:41.293 "state": "online", 00:24:41.293 "raid_level": "raid1", 00:24:41.293 "superblock": false, 00:24:41.293 "num_base_bdevs": 2, 00:24:41.293 "num_base_bdevs_discovered": 2, 00:24:41.293 "num_base_bdevs_operational": 2, 00:24:41.293 "base_bdevs_list": [ 00:24:41.293 { 00:24:41.293 "name": "spare", 00:24:41.293 "uuid": "bacf46d1-5adf-5d0a-8d71-30ed888a40e8", 00:24:41.293 "is_configured": true, 00:24:41.293 "data_offset": 0, 00:24:41.293 "data_size": 65536 00:24:41.293 }, 00:24:41.293 { 00:24:41.293 "name": "BaseBdev2", 00:24:41.293 "uuid": "dc185903-0400-5d1f-9869-779cac8c16a7", 00:24:41.293 "is_configured": true, 00:24:41.293 "data_offset": 0, 00:24:41.293 "data_size": 65536 00:24:41.293 } 00:24:41.293 ] 00:24:41.293 }' 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.293 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.551 22:31:51 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:41.551 "name": "raid_bdev1", 00:24:41.551 "uuid": "adbfd812-9850-411c-98dc-f6498615d622", 00:24:41.551 "strip_size_kb": 0, 00:24:41.551 "state": "online", 00:24:41.551 "raid_level": "raid1", 00:24:41.552 "superblock": false, 00:24:41.552 "num_base_bdevs": 2, 00:24:41.552 "num_base_bdevs_discovered": 2, 00:24:41.552 "num_base_bdevs_operational": 2, 00:24:41.552 "base_bdevs_list": [ 00:24:41.552 { 00:24:41.552 "name": "spare", 00:24:41.552 "uuid": "bacf46d1-5adf-5d0a-8d71-30ed888a40e8", 00:24:41.552 "is_configured": true, 00:24:41.552 "data_offset": 0, 00:24:41.552 "data_size": 65536 00:24:41.552 }, 00:24:41.552 { 00:24:41.552 "name": "BaseBdev2", 00:24:41.552 "uuid": "dc185903-0400-5d1f-9869-779cac8c16a7", 00:24:41.552 "is_configured": true, 00:24:41.552 "data_offset": 0, 00:24:41.552 "data_size": 65536 00:24:41.552 } 00:24:41.552 ] 00:24:41.552 }' 00:24:41.552 22:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:41.552 22:31:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:42.118 22:31:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:42.378 [2024-07-12 22:31:52.630252] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:42.378 [2024-07-12 22:31:52.630284] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:42.652 00:24:42.652 Latency(us) 00:24:42.652 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:42.652 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:42.652 raid_bdev1 : 10.39 102.53 307.58 0.00 0.00 12738.20 295.62 110784.33 00:24:42.652 =================================================================================================================== 00:24:42.652 Total : 102.53 307.58 0.00 0.00 12738.20 295.62 110784.33 00:24:42.652 [2024-07-12 22:31:52.743429] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:42.652 [2024-07-12 22:31:52.743472] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:42.652 [2024-07-12 22:31:52.743547] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:42.652 [2024-07-12 22:31:52.743559] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1027070 name raid_bdev1, state offline 00:24:42.652 0 00:24:42.652 22:31:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.652 22:31:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io 
-- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:42.917 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:43.175 /dev/nbd0 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:43.175 1+0 records in 00:24:43.175 1+0 records out 00:24:43.175 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290754 s, 14.1 MB/s 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:43.175 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:24:43.434 /dev/nbd1 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:43.434 1+0 records in 00:24:43.434 1+0 records out 00:24:43.434 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254456 s, 16.1 MB/s 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:43.434 
22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:43.434 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:43.693 22:31:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:43.951 22:31:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 3536066 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 3536066 ']' 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 3536066 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- 
# uname 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3536066 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3536066' 00:24:44.210 killing process with pid 3536066 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 3536066 00:24:44.210 Received shutdown signal, test time was about 11.978342 seconds 00:24:44.210 00:24:44.210 Latency(us) 00:24:44.210 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:44.210 =================================================================================================================== 00:24:44.210 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:44.210 [2024-07-12 22:31:54.334135] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:44.210 22:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 3536066 00:24:44.210 [2024-07-12 22:31:54.355946] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:44.469 22:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:44.469 00:24:44.469 real 0m16.571s 00:24:44.469 user 0m25.302s 00:24:44.469 sys 0m2.839s 00:24:44.469 22:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:44.469 22:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:44.469 ************************************ 00:24:44.469 END TEST raid_rebuild_test_io 00:24:44.469 ************************************ 00:24:44.469 22:31:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:44.469 22:31:54 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:24:44.469 22:31:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:44.469 22:31:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:44.469 22:31:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:44.469 ************************************ 00:24:44.469 START TEST raid_rebuild_test_sb_io 00:24:44.469 ************************************ 00:24:44.469 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:24:44.469 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:44.469 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:44.469 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:44.469 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:44.469 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:44.469 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=3538508 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 3538508 /var/tmp/spdk-raid.sock 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 3538508 ']' 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:44.470 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:44.470 22:31:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:44.470 [2024-07-12 22:31:54.751527] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
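The entries above show the harness starting bdevperf as the RPC target for the raid_rebuild_test_sb_io run. A minimal sketch of that launch, assembled only from the flags visible in the trace; SPDK_DIR is an assumed shorthand for the workspace checkout, and the polling loop is a simplified stand-in for the harness's waitforlisten helper, not its actual implementation:

    #!/usr/bin/env bash
    # Sketch only: start bdevperf with the options seen in the trace and wait
    # for its RPC socket to come up before issuing bdev_raid RPCs against it.
    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}   # assumed shorthand
    rpc_sock=/var/tmp/spdk-raid.sock

    "$SPDK_DIR/build/examples/bdevperf" -r "$rpc_sock" -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!

    # Simplified stand-in for waitforlisten: poll until the socket answers.
    until "$SPDK_DIR/scripts/rpc.py" -s "$rpc_sock" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.5
    done

Once the socket is listening, the base bdevs (malloc and passthru devices) and the raid1 bdev itself are created through the same rpc.py calls recorded in the entries that follow.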
00:24:44.470 [2024-07-12 22:31:54.751602] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3538508 ] 00:24:44.470 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:44.470 Zero copy mechanism will not be used. 00:24:44.729 [2024-07-12 22:31:54.880156] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:44.729 [2024-07-12 22:31:54.983050] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:44.729 [2024-07-12 22:31:55.041884] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:44.729 [2024-07-12 22:31:55.041916] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:45.667 22:31:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:45.667 22:31:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:24:45.667 22:31:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:45.667 22:31:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:45.667 BaseBdev1_malloc 00:24:45.667 22:31:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:45.926 [2024-07-12 22:31:56.162268] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:45.926 [2024-07-12 22:31:56.162321] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:45.926 [2024-07-12 22:31:56.162344] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad7d40 00:24:45.926 [2024-07-12 22:31:56.162357] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:45.926 [2024-07-12 22:31:56.163946] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:45.926 [2024-07-12 22:31:56.163973] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:45.926 BaseBdev1 00:24:45.926 22:31:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:45.926 22:31:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:46.184 BaseBdev2_malloc 00:24:46.184 22:31:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:46.443 [2024-07-12 22:31:56.656301] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:46.443 [2024-07-12 22:31:56.656344] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:46.443 [2024-07-12 22:31:56.656366] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad8860 00:24:46.443 [2024-07-12 22:31:56.656379] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:46.443 [2024-07-12 
22:31:56.657739] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:46.443 [2024-07-12 22:31:56.657765] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:46.443 BaseBdev2 00:24:46.444 22:31:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:46.703 spare_malloc 00:24:46.703 22:31:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:46.963 spare_delay 00:24:46.963 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:47.223 [2024-07-12 22:31:57.386814] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:47.223 [2024-07-12 22:31:57.386862] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:47.223 [2024-07-12 22:31:57.386881] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c86ec0 00:24:47.223 [2024-07-12 22:31:57.386894] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:47.223 [2024-07-12 22:31:57.388329] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:47.223 [2024-07-12 22:31:57.388357] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:47.223 spare 00:24:47.223 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:47.482 [2024-07-12 22:31:57.627479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:47.482 [2024-07-12 22:31:57.628656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:47.482 [2024-07-12 22:31:57.628821] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c88070 00:24:47.482 [2024-07-12 22:31:57.628834] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:47.482 [2024-07-12 22:31:57.629022] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c81490 00:24:47.482 [2024-07-12 22:31:57.629160] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c88070 00:24:47.482 [2024-07-12 22:31:57.629170] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c88070 00:24:47.482 [2024-07-12 22:31:57.629261] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:47.482 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:47.482 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:47.482 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:47.482 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:47.482 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:24:47.482 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:47.482 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:47.482 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:47.482 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:47.482 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:47.482 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.482 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.742 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:47.742 "name": "raid_bdev1", 00:24:47.742 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:24:47.742 "strip_size_kb": 0, 00:24:47.742 "state": "online", 00:24:47.742 "raid_level": "raid1", 00:24:47.742 "superblock": true, 00:24:47.742 "num_base_bdevs": 2, 00:24:47.742 "num_base_bdevs_discovered": 2, 00:24:47.742 "num_base_bdevs_operational": 2, 00:24:47.742 "base_bdevs_list": [ 00:24:47.742 { 00:24:47.742 "name": "BaseBdev1", 00:24:47.742 "uuid": "0ef35ce7-c523-590f-b47e-c0774e3be150", 00:24:47.742 "is_configured": true, 00:24:47.742 "data_offset": 2048, 00:24:47.742 "data_size": 63488 00:24:47.742 }, 00:24:47.742 { 00:24:47.742 "name": "BaseBdev2", 00:24:47.742 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:24:47.742 "is_configured": true, 00:24:47.742 "data_offset": 2048, 00:24:47.742 "data_size": 63488 00:24:47.742 } 00:24:47.742 ] 00:24:47.742 }' 00:24:47.742 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:47.742 22:31:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:48.311 22:31:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:48.311 22:31:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:48.571 [2024-07-12 22:31:58.730631] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:48.571 22:31:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:48.571 22:31:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.571 22:31:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:48.831 22:31:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:48.831 22:31:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:48.831 22:31:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:48.831 22:31:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:24:48.831 [2024-07-12 22:31:59.093435] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c88c50 00:24:48.831 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:48.831 Zero copy mechanism will not be used. 00:24:48.831 Running I/O for 60 seconds... 00:24:49.091 [2024-07-12 22:31:59.217469] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:49.091 [2024-07-12 22:31:59.225638] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1c88c50 00:24:49.091 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:49.091 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:49.091 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:49.091 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:49.091 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:49.091 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:49.091 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:49.091 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:49.091 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:49.091 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:49.091 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.091 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.350 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:49.350 "name": "raid_bdev1", 00:24:49.350 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:24:49.350 "strip_size_kb": 0, 00:24:49.350 "state": "online", 00:24:49.350 "raid_level": "raid1", 00:24:49.350 "superblock": true, 00:24:49.350 "num_base_bdevs": 2, 00:24:49.350 "num_base_bdevs_discovered": 1, 00:24:49.350 "num_base_bdevs_operational": 1, 00:24:49.350 "base_bdevs_list": [ 00:24:49.350 { 00:24:49.350 "name": null, 00:24:49.350 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:49.350 "is_configured": false, 00:24:49.350 "data_offset": 2048, 00:24:49.350 "data_size": 63488 00:24:49.350 }, 00:24:49.350 { 00:24:49.350 "name": "BaseBdev2", 00:24:49.350 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:24:49.350 "is_configured": true, 00:24:49.350 "data_offset": 2048, 00:24:49.350 "data_size": 63488 00:24:49.350 } 00:24:49.350 ] 00:24:49.350 }' 00:24:49.350 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:49.350 22:31:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:49.924 22:32:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:50.182 [2024-07-12 22:32:00.302215] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:50.182 
22:32:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:50.182 [2024-07-12 22:32:00.369365] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf4230 00:24:50.182 [2024-07-12 22:32:00.371749] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:50.182 [2024-07-12 22:32:00.490496] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:50.182 [2024-07-12 22:32:00.490960] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:50.441 [2024-07-12 22:32:00.718640] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:50.441 [2024-07-12 22:32:00.718900] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:51.009 [2024-07-12 22:32:01.066914] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:51.009 [2024-07-12 22:32:01.211453] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:51.009 [2024-07-12 22:32:01.211696] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:51.268 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:51.268 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:51.268 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:51.268 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:51.268 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:51.268 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.268 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.268 [2024-07-12 22:32:01.534089] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:51.268 [2024-07-12 22:32:01.534453] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:51.526 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:51.526 "name": "raid_bdev1", 00:24:51.526 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:24:51.526 "strip_size_kb": 0, 00:24:51.526 "state": "online", 00:24:51.526 "raid_level": "raid1", 00:24:51.526 "superblock": true, 00:24:51.526 "num_base_bdevs": 2, 00:24:51.526 "num_base_bdevs_discovered": 2, 00:24:51.526 "num_base_bdevs_operational": 2, 00:24:51.526 "process": { 00:24:51.526 "type": "rebuild", 00:24:51.526 "target": "spare", 00:24:51.526 "progress": { 00:24:51.526 "blocks": 14336, 00:24:51.526 "percent": 22 00:24:51.526 } 00:24:51.526 }, 00:24:51.526 "base_bdevs_list": [ 00:24:51.526 { 00:24:51.526 "name": "spare", 00:24:51.526 "uuid": "67f76396-c321-5b98-9850-cfc8faf3f2b4", 00:24:51.526 "is_configured": true, 00:24:51.526 
"data_offset": 2048, 00:24:51.526 "data_size": 63488 00:24:51.526 }, 00:24:51.526 { 00:24:51.526 "name": "BaseBdev2", 00:24:51.526 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:24:51.526 "is_configured": true, 00:24:51.526 "data_offset": 2048, 00:24:51.526 "data_size": 63488 00:24:51.526 } 00:24:51.526 ] 00:24:51.526 }' 00:24:51.526 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:51.526 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:51.526 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:51.526 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:51.526 22:32:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:51.526 [2024-07-12 22:32:01.744180] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:51.526 [2024-07-12 22:32:01.744412] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:51.786 [2024-07-12 22:32:01.920266] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:51.786 [2024-07-12 22:32:02.082339] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:51.786 [2024-07-12 22:32:02.092141] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:51.786 [2024-07-12 22:32:02.092172] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:51.786 [2024-07-12 22:32:02.092183] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:52.045 [2024-07-12 22:32:02.122505] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1c88c50 00:24:52.045 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:52.045 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:52.045 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:52.045 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:52.045 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:52.045 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:52.045 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:52.045 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:52.045 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:52.045 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:52.045 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.045 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:24:52.451 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:52.451 "name": "raid_bdev1", 00:24:52.451 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:24:52.451 "strip_size_kb": 0, 00:24:52.451 "state": "online", 00:24:52.451 "raid_level": "raid1", 00:24:52.451 "superblock": true, 00:24:52.451 "num_base_bdevs": 2, 00:24:52.451 "num_base_bdevs_discovered": 1, 00:24:52.451 "num_base_bdevs_operational": 1, 00:24:52.451 "base_bdevs_list": [ 00:24:52.451 { 00:24:52.451 "name": null, 00:24:52.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:52.451 "is_configured": false, 00:24:52.451 "data_offset": 2048, 00:24:52.451 "data_size": 63488 00:24:52.451 }, 00:24:52.452 { 00:24:52.452 "name": "BaseBdev2", 00:24:52.452 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:24:52.452 "is_configured": true, 00:24:52.452 "data_offset": 2048, 00:24:52.452 "data_size": 63488 00:24:52.452 } 00:24:52.452 ] 00:24:52.452 }' 00:24:52.452 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:52.452 22:32:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:52.728 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:52.728 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:52.728 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:52.728 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:52.728 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:52.728 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.728 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:52.999 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:52.999 "name": "raid_bdev1", 00:24:52.999 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:24:52.999 "strip_size_kb": 0, 00:24:52.999 "state": "online", 00:24:52.999 "raid_level": "raid1", 00:24:52.999 "superblock": true, 00:24:52.999 "num_base_bdevs": 2, 00:24:52.999 "num_base_bdevs_discovered": 1, 00:24:52.999 "num_base_bdevs_operational": 1, 00:24:52.999 "base_bdevs_list": [ 00:24:52.999 { 00:24:52.999 "name": null, 00:24:52.999 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:52.999 "is_configured": false, 00:24:52.999 "data_offset": 2048, 00:24:52.999 "data_size": 63488 00:24:52.999 }, 00:24:52.999 { 00:24:52.999 "name": "BaseBdev2", 00:24:52.999 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:24:52.999 "is_configured": true, 00:24:52.999 "data_offset": 2048, 00:24:52.999 "data_size": 63488 00:24:52.999 } 00:24:52.999 ] 00:24:52.999 }' 00:24:52.999 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:52.999 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:52.999 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:52.999 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:52.999 22:32:03 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:53.258 [2024-07-12 22:32:03.511895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:53.258 22:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:53.517 [2024-07-12 22:32:03.595653] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c88e60 00:24:53.517 [2024-07-12 22:32:03.597158] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:53.517 [2024-07-12 22:32:03.716083] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:53.517 [2024-07-12 22:32:03.716634] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:53.777 [2024-07-12 22:32:03.845340] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:53.777 [2024-07-12 22:32:03.845539] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:54.035 [2024-07-12 22:32:04.102725] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:54.035 [2024-07-12 22:32:04.103204] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:54.035 [2024-07-12 22:32:04.313566] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:54.035 [2024-07-12 22:32:04.313717] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:54.293 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:54.293 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:54.293 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:54.293 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:54.293 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:54.293 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.293 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:54.553 [2024-07-12 22:32:04.781542] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:54.553 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:54.553 "name": "raid_bdev1", 00:24:54.553 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:24:54.553 "strip_size_kb": 0, 00:24:54.553 "state": "online", 00:24:54.553 "raid_level": "raid1", 00:24:54.553 "superblock": true, 00:24:54.553 "num_base_bdevs": 2, 00:24:54.553 "num_base_bdevs_discovered": 2, 00:24:54.553 "num_base_bdevs_operational": 2, 00:24:54.553 "process": { 00:24:54.553 "type": "rebuild", 00:24:54.553 
"target": "spare", 00:24:54.553 "progress": { 00:24:54.553 "blocks": 16384, 00:24:54.553 "percent": 25 00:24:54.553 } 00:24:54.553 }, 00:24:54.553 "base_bdevs_list": [ 00:24:54.553 { 00:24:54.553 "name": "spare", 00:24:54.553 "uuid": "67f76396-c321-5b98-9850-cfc8faf3f2b4", 00:24:54.553 "is_configured": true, 00:24:54.553 "data_offset": 2048, 00:24:54.553 "data_size": 63488 00:24:54.553 }, 00:24:54.553 { 00:24:54.553 "name": "BaseBdev2", 00:24:54.553 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:24:54.553 "is_configured": true, 00:24:54.553 "data_offset": 2048, 00:24:54.553 "data_size": 63488 00:24:54.553 } 00:24:54.553 ] 00:24:54.553 }' 00:24:54.553 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:54.553 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:54.553 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:54.812 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=828 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.812 22:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.071 [2024-07-12 22:32:05.163123] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:55.071 [2024-07-12 22:32:05.163407] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:55.071 22:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:55.071 "name": "raid_bdev1", 00:24:55.071 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:24:55.071 "strip_size_kb": 0, 00:24:55.071 "state": "online", 00:24:55.071 "raid_level": "raid1", 00:24:55.071 
"superblock": true, 00:24:55.071 "num_base_bdevs": 2, 00:24:55.071 "num_base_bdevs_discovered": 2, 00:24:55.071 "num_base_bdevs_operational": 2, 00:24:55.071 "process": { 00:24:55.071 "type": "rebuild", 00:24:55.071 "target": "spare", 00:24:55.071 "progress": { 00:24:55.071 "blocks": 20480, 00:24:55.071 "percent": 32 00:24:55.071 } 00:24:55.071 }, 00:24:55.071 "base_bdevs_list": [ 00:24:55.071 { 00:24:55.071 "name": "spare", 00:24:55.071 "uuid": "67f76396-c321-5b98-9850-cfc8faf3f2b4", 00:24:55.071 "is_configured": true, 00:24:55.071 "data_offset": 2048, 00:24:55.071 "data_size": 63488 00:24:55.071 }, 00:24:55.071 { 00:24:55.071 "name": "BaseBdev2", 00:24:55.071 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:24:55.071 "is_configured": true, 00:24:55.071 "data_offset": 2048, 00:24:55.071 "data_size": 63488 00:24:55.071 } 00:24:55.071 ] 00:24:55.071 }' 00:24:55.071 22:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:55.071 22:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:55.071 22:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:55.071 22:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:55.071 22:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:55.329 [2024-07-12 22:32:05.622007] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:55.329 [2024-07-12 22:32:05.631213] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:56.266 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:56.266 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:56.266 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:56.266 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:56.266 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:56.266 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:56.266 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.266 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.266 [2024-07-12 22:32:06.308627] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:24:56.266 [2024-07-12 22:32:06.309060] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:24:56.266 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:56.266 "name": "raid_bdev1", 00:24:56.266 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:24:56.266 "strip_size_kb": 0, 00:24:56.266 "state": "online", 00:24:56.266 "raid_level": "raid1", 00:24:56.266 "superblock": true, 00:24:56.266 "num_base_bdevs": 2, 00:24:56.266 "num_base_bdevs_discovered": 2, 00:24:56.266 
"num_base_bdevs_operational": 2, 00:24:56.266 "process": { 00:24:56.266 "type": "rebuild", 00:24:56.266 "target": "spare", 00:24:56.266 "progress": { 00:24:56.266 "blocks": 38912, 00:24:56.266 "percent": 61 00:24:56.266 } 00:24:56.266 }, 00:24:56.266 "base_bdevs_list": [ 00:24:56.266 { 00:24:56.266 "name": "spare", 00:24:56.266 "uuid": "67f76396-c321-5b98-9850-cfc8faf3f2b4", 00:24:56.266 "is_configured": true, 00:24:56.266 "data_offset": 2048, 00:24:56.266 "data_size": 63488 00:24:56.266 }, 00:24:56.266 { 00:24:56.266 "name": "BaseBdev2", 00:24:56.266 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:24:56.266 "is_configured": true, 00:24:56.266 "data_offset": 2048, 00:24:56.266 "data_size": 63488 00:24:56.266 } 00:24:56.266 ] 00:24:56.266 }' 00:24:56.266 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:56.266 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:56.266 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:56.526 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:56.526 22:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:56.526 [2024-07-12 22:32:06.748102] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:24:56.785 [2024-07-12 22:32:06.951357] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:24:57.353 [2024-07-12 22:32:07.593634] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:24:57.353 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:57.353 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:57.353 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:57.353 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:57.353 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:57.353 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:57.353 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.353 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:57.611 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:57.612 "name": "raid_bdev1", 00:24:57.612 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:24:57.612 "strip_size_kb": 0, 00:24:57.612 "state": "online", 00:24:57.612 "raid_level": "raid1", 00:24:57.612 "superblock": true, 00:24:57.612 "num_base_bdevs": 2, 00:24:57.612 "num_base_bdevs_discovered": 2, 00:24:57.612 "num_base_bdevs_operational": 2, 00:24:57.612 "process": { 00:24:57.612 "type": "rebuild", 00:24:57.612 "target": "spare", 00:24:57.612 "progress": { 00:24:57.612 "blocks": 61440, 00:24:57.612 "percent": 96 00:24:57.612 } 00:24:57.612 }, 00:24:57.612 "base_bdevs_list": [ 00:24:57.612 { 00:24:57.612 
"name": "spare", 00:24:57.612 "uuid": "67f76396-c321-5b98-9850-cfc8faf3f2b4", 00:24:57.612 "is_configured": true, 00:24:57.612 "data_offset": 2048, 00:24:57.612 "data_size": 63488 00:24:57.612 }, 00:24:57.612 { 00:24:57.612 "name": "BaseBdev2", 00:24:57.612 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:24:57.612 "is_configured": true, 00:24:57.612 "data_offset": 2048, 00:24:57.612 "data_size": 63488 00:24:57.612 } 00:24:57.612 ] 00:24:57.612 }' 00:24:57.612 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:57.612 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:57.612 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:57.612 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:57.612 [2024-07-12 22:32:07.932068] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:57.612 22:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:57.870 [2024-07-12 22:32:08.032303] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:57.870 [2024-07-12 22:32:08.033915] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:58.809 22:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:58.809 22:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:58.809 22:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:58.809 22:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:58.809 22:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:58.809 22:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:58.809 22:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.809 22:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.809 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:58.809 "name": "raid_bdev1", 00:24:58.809 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:24:58.809 "strip_size_kb": 0, 00:24:58.809 "state": "online", 00:24:58.809 "raid_level": "raid1", 00:24:58.809 "superblock": true, 00:24:58.809 "num_base_bdevs": 2, 00:24:58.809 "num_base_bdevs_discovered": 2, 00:24:58.809 "num_base_bdevs_operational": 2, 00:24:58.809 "base_bdevs_list": [ 00:24:58.809 { 00:24:58.809 "name": "spare", 00:24:58.809 "uuid": "67f76396-c321-5b98-9850-cfc8faf3f2b4", 00:24:58.809 "is_configured": true, 00:24:58.809 "data_offset": 2048, 00:24:58.809 "data_size": 63488 00:24:58.809 }, 00:24:58.809 { 00:24:58.809 "name": "BaseBdev2", 00:24:58.809 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:24:58.809 "is_configured": true, 00:24:58.809 "data_offset": 2048, 00:24:58.809 "data_size": 63488 00:24:58.809 } 00:24:58.809 ] 00:24:58.809 }' 00:24:58.809 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:59.069 22:32:09 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:59.069 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:59.069 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:59.069 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:24:59.069 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:59.069 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:59.069 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:59.069 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:59.069 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:59.069 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.069 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.328 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:59.328 "name": "raid_bdev1", 00:24:59.328 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:24:59.328 "strip_size_kb": 0, 00:24:59.328 "state": "online", 00:24:59.328 "raid_level": "raid1", 00:24:59.328 "superblock": true, 00:24:59.328 "num_base_bdevs": 2, 00:24:59.328 "num_base_bdevs_discovered": 2, 00:24:59.328 "num_base_bdevs_operational": 2, 00:24:59.328 "base_bdevs_list": [ 00:24:59.328 { 00:24:59.328 "name": "spare", 00:24:59.328 "uuid": "67f76396-c321-5b98-9850-cfc8faf3f2b4", 00:24:59.328 "is_configured": true, 00:24:59.328 "data_offset": 2048, 00:24:59.328 "data_size": 63488 00:24:59.328 }, 00:24:59.328 { 00:24:59.328 "name": "BaseBdev2", 00:24:59.328 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:24:59.328 "is_configured": true, 00:24:59.328 "data_offset": 2048, 00:24:59.328 "data_size": 63488 00:24:59.328 } 00:24:59.328 ] 00:24:59.328 }' 00:24:59.328 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:59.328 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:59.328 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:59.328 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:59.328 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:59.328 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:59.328 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:59.329 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:59.329 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:59.329 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:59.329 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:24:59.329 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:59.329 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:59.329 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:59.329 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.329 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.588 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:59.588 "name": "raid_bdev1", 00:24:59.588 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:24:59.588 "strip_size_kb": 0, 00:24:59.588 "state": "online", 00:24:59.588 "raid_level": "raid1", 00:24:59.588 "superblock": true, 00:24:59.588 "num_base_bdevs": 2, 00:24:59.588 "num_base_bdevs_discovered": 2, 00:24:59.588 "num_base_bdevs_operational": 2, 00:24:59.588 "base_bdevs_list": [ 00:24:59.588 { 00:24:59.588 "name": "spare", 00:24:59.588 "uuid": "67f76396-c321-5b98-9850-cfc8faf3f2b4", 00:24:59.588 "is_configured": true, 00:24:59.588 "data_offset": 2048, 00:24:59.588 "data_size": 63488 00:24:59.588 }, 00:24:59.588 { 00:24:59.588 "name": "BaseBdev2", 00:24:59.588 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:24:59.588 "is_configured": true, 00:24:59.588 "data_offset": 2048, 00:24:59.588 "data_size": 63488 00:24:59.588 } 00:24:59.588 ] 00:24:59.588 }' 00:24:59.588 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:59.588 22:32:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:00.156 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:00.417 [2024-07-12 22:32:10.616227] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:00.417 [2024-07-12 22:32:10.616258] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:00.417 00:25:00.417 Latency(us) 00:25:00.417 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:00.417 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:00.417 raid_bdev1 : 11.56 87.34 262.01 0.00 0.00 14993.70 297.41 119446.48 00:25:00.417 =================================================================================================================== 00:25:00.417 Total : 87.34 262.01 0.00 0.00 14993.70 297.41 119446.48 00:25:00.417 [2024-07-12 22:32:10.692455] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:00.417 [2024-07-12 22:32:10.692483] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:00.417 [2024-07-12 22:32:10.692557] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:00.417 [2024-07-12 22:32:10.692569] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c88070 name raid_bdev1, state offline 00:25:00.417 0 00:25:00.417 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:25:00.417 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:00.676 22:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:00.936 /dev/nbd0 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:00.936 1+0 records in 00:25:00.936 1+0 records out 00:25:00.936 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000440919 s, 9.3 MB/s 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:00.936 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:25:01.195 /dev/nbd1 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:01.195 1+0 records in 00:25:01.195 1+0 records out 00:25:01.195 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243278 s, 16.8 MB/s 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:01.195 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:01.455 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:01.714 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:01.714 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:01.714 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:01.714 
22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:01.714 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:01.714 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:01.714 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:01.714 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:01.714 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:01.714 22:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:01.973 [2024-07-12 22:32:12.240129] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:01.973 [2024-07-12 22:32:12.240179] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:01.973 [2024-07-12 22:32:12.240201] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad7490 00:25:01.973 [2024-07-12 22:32:12.240214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:01.973 [2024-07-12 22:32:12.241873] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:01.973 [2024-07-12 22:32:12.241902] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:01.973 [2024-07-12 22:32:12.241993] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:01.973 [2024-07-12 22:32:12.242021] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:01.973 [2024-07-12 22:32:12.242122] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:01.973 spare 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.973 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.232 [2024-07-12 22:32:12.342436] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ad6f70 00:25:02.232 [2024-07-12 22:32:12.342454] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:02.232 [2024-07-12 22:32:12.342654] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ae9230 00:25:02.232 [2024-07-12 22:32:12.342807] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ad6f70 00:25:02.232 [2024-07-12 22:32:12.342817] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ad6f70 00:25:02.232 [2024-07-12 22:32:12.342934] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:02.232 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:02.232 "name": "raid_bdev1", 00:25:02.232 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:25:02.232 "strip_size_kb": 0, 00:25:02.232 "state": "online", 00:25:02.232 "raid_level": "raid1", 00:25:02.232 "superblock": true, 00:25:02.232 "num_base_bdevs": 2, 00:25:02.232 "num_base_bdevs_discovered": 2, 00:25:02.232 "num_base_bdevs_operational": 2, 00:25:02.232 "base_bdevs_list": [ 00:25:02.232 { 00:25:02.232 "name": "spare", 00:25:02.232 "uuid": "67f76396-c321-5b98-9850-cfc8faf3f2b4", 00:25:02.232 "is_configured": true, 00:25:02.232 "data_offset": 2048, 00:25:02.232 "data_size": 63488 00:25:02.232 }, 00:25:02.232 { 00:25:02.232 "name": "BaseBdev2", 00:25:02.232 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:25:02.232 "is_configured": true, 00:25:02.232 "data_offset": 2048, 00:25:02.232 "data_size": 63488 00:25:02.232 } 00:25:02.232 ] 00:25:02.232 }' 00:25:02.232 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:02.232 22:32:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:03.168 "name": "raid_bdev1", 00:25:03.168 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:25:03.168 "strip_size_kb": 0, 00:25:03.168 "state": "online", 00:25:03.168 "raid_level": "raid1", 00:25:03.168 "superblock": true, 00:25:03.168 "num_base_bdevs": 2, 00:25:03.168 "num_base_bdevs_discovered": 2, 00:25:03.168 "num_base_bdevs_operational": 2, 00:25:03.168 "base_bdevs_list": [ 00:25:03.168 { 00:25:03.168 "name": "spare", 00:25:03.168 "uuid": "67f76396-c321-5b98-9850-cfc8faf3f2b4", 00:25:03.168 "is_configured": true, 00:25:03.168 "data_offset": 2048, 
00:25:03.168 "data_size": 63488 00:25:03.168 }, 00:25:03.168 { 00:25:03.168 "name": "BaseBdev2", 00:25:03.168 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:25:03.168 "is_configured": true, 00:25:03.168 "data_offset": 2048, 00:25:03.168 "data_size": 63488 00:25:03.168 } 00:25:03.168 ] 00:25:03.168 }' 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.168 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:03.427 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:03.427 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:03.686 [2024-07-12 22:32:13.945097] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:03.686 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:03.686 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:03.686 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:03.686 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:03.686 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:03.686 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:03.686 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:03.686 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:03.686 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:03.686 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:03.686 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.686 22:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.945 22:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:03.945 "name": "raid_bdev1", 00:25:03.945 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:25:03.945 "strip_size_kb": 0, 00:25:03.945 "state": "online", 00:25:03.945 "raid_level": "raid1", 00:25:03.945 "superblock": true, 00:25:03.945 "num_base_bdevs": 2, 00:25:03.945 "num_base_bdevs_discovered": 1, 00:25:03.945 "num_base_bdevs_operational": 1, 00:25:03.945 "base_bdevs_list": [ 00:25:03.945 { 00:25:03.945 "name": null, 00:25:03.945 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:03.945 "is_configured": false, 00:25:03.945 "data_offset": 2048, 00:25:03.945 "data_size": 63488 00:25:03.945 }, 00:25:03.945 { 00:25:03.945 "name": "BaseBdev2", 00:25:03.945 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:25:03.945 "is_configured": true, 00:25:03.945 "data_offset": 2048, 00:25:03.945 "data_size": 63488 00:25:03.945 } 00:25:03.945 ] 00:25:03.945 }' 00:25:03.945 22:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:03.945 22:32:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:04.513 22:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:04.772 [2024-07-12 22:32:14.996013] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:04.772 [2024-07-12 22:32:14.996170] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:04.772 [2024-07-12 22:32:14.996186] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:04.772 [2024-07-12 22:32:14.996215] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:04.772 [2024-07-12 22:32:15.001448] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c81490 00:25:04.772 [2024-07-12 22:32:15.003765] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:04.772 22:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:05.709 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:05.709 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:05.709 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:05.709 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:05.709 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:05.709 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.709 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:05.968 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:05.968 "name": "raid_bdev1", 00:25:05.968 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:25:05.968 "strip_size_kb": 0, 00:25:05.968 "state": "online", 00:25:05.968 "raid_level": "raid1", 00:25:05.968 "superblock": true, 00:25:05.968 "num_base_bdevs": 2, 00:25:05.968 "num_base_bdevs_discovered": 2, 00:25:05.968 "num_base_bdevs_operational": 2, 00:25:05.968 "process": { 00:25:05.968 "type": "rebuild", 00:25:05.968 "target": "spare", 00:25:05.968 "progress": { 00:25:05.968 "blocks": 24576, 00:25:05.968 "percent": 38 00:25:05.968 } 00:25:05.968 }, 00:25:05.968 "base_bdevs_list": [ 00:25:05.968 { 00:25:05.968 "name": "spare", 00:25:05.968 "uuid": "67f76396-c321-5b98-9850-cfc8faf3f2b4", 00:25:05.968 "is_configured": true, 00:25:05.968 "data_offset": 2048, 00:25:05.968 
"data_size": 63488 00:25:05.968 }, 00:25:05.968 { 00:25:05.968 "name": "BaseBdev2", 00:25:05.968 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:25:05.968 "is_configured": true, 00:25:05.968 "data_offset": 2048, 00:25:05.968 "data_size": 63488 00:25:05.968 } 00:25:05.968 ] 00:25:05.968 }' 00:25:05.968 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:06.226 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:06.226 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:06.226 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:06.227 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:06.486 [2024-07-12 22:32:16.586213] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:06.486 [2024-07-12 22:32:16.616433] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:06.486 [2024-07-12 22:32:16.616478] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:06.486 [2024-07-12 22:32:16.616493] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:06.486 [2024-07-12 22:32:16.616502] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:06.486 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:06.486 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:06.486 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:06.486 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:06.486 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:06.486 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:06.486 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:06.486 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:06.486 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:06.486 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:06.486 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.486 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.744 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:06.744 "name": "raid_bdev1", 00:25:06.744 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:25:06.744 "strip_size_kb": 0, 00:25:06.744 "state": "online", 00:25:06.744 "raid_level": "raid1", 00:25:06.744 "superblock": true, 00:25:06.744 "num_base_bdevs": 2, 00:25:06.744 "num_base_bdevs_discovered": 1, 00:25:06.744 "num_base_bdevs_operational": 1, 00:25:06.744 "base_bdevs_list": [ 
00:25:06.744 { 00:25:06.744 "name": null, 00:25:06.744 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:06.744 "is_configured": false, 00:25:06.744 "data_offset": 2048, 00:25:06.744 "data_size": 63488 00:25:06.744 }, 00:25:06.744 { 00:25:06.744 "name": "BaseBdev2", 00:25:06.744 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:25:06.744 "is_configured": true, 00:25:06.744 "data_offset": 2048, 00:25:06.744 "data_size": 63488 00:25:06.744 } 00:25:06.744 ] 00:25:06.744 }' 00:25:06.744 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:06.744 22:32:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:07.311 22:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:07.569 [2024-07-12 22:32:17.708318] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:07.569 [2024-07-12 22:32:17.708371] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:07.569 [2024-07-12 22:32:17.708394] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aead20 00:25:07.569 [2024-07-12 22:32:17.708407] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:07.569 [2024-07-12 22:32:17.708784] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:07.569 [2024-07-12 22:32:17.708802] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:07.569 [2024-07-12 22:32:17.708886] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:07.569 [2024-07-12 22:32:17.708898] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:07.569 [2024-07-12 22:32:17.708916] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
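(For readers following the trace: the "re-adding spare" step above is driven entirely through the RPC socket printed in the log. A minimal by-hand sketch of the same sequence is given below; the bdev names, socket path, and jq filter are copied from the log lines above, while wrapping them into a standalone snippet is purely illustrative and not part of the captured run.)

    # Re-create the delayed passthru bdev on top of the spare device; bdev_raid then
    # re-examines its superblock and re-adds it to raid_bdev1 (names as in the log above).
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_passthru_create -b spare_delay -p spare

    # Check whether a rebuild process is running, using the same jq filters
    # bdev_raid.sh applies in verify_raid_bdev_process.
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"'
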
00:25:07.569 [2024-07-12 22:32:17.708947] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:07.569 [2024-07-12 22:32:17.714269] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c81490 00:25:07.569 spare 00:25:07.569 [2024-07-12 22:32:17.715727] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:07.570 22:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:08.505 22:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:08.505 22:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:08.505 22:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:08.505 22:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:08.505 22:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:08.505 22:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.505 22:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:08.764 22:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:08.764 "name": "raid_bdev1", 00:25:08.764 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:25:08.764 "strip_size_kb": 0, 00:25:08.764 "state": "online", 00:25:08.764 "raid_level": "raid1", 00:25:08.764 "superblock": true, 00:25:08.764 "num_base_bdevs": 2, 00:25:08.764 "num_base_bdevs_discovered": 2, 00:25:08.764 "num_base_bdevs_operational": 2, 00:25:08.764 "process": { 00:25:08.764 "type": "rebuild", 00:25:08.764 "target": "spare", 00:25:08.764 "progress": { 00:25:08.764 "blocks": 24576, 00:25:08.764 "percent": 38 00:25:08.764 } 00:25:08.764 }, 00:25:08.764 "base_bdevs_list": [ 00:25:08.764 { 00:25:08.764 "name": "spare", 00:25:08.764 "uuid": "67f76396-c321-5b98-9850-cfc8faf3f2b4", 00:25:08.764 "is_configured": true, 00:25:08.764 "data_offset": 2048, 00:25:08.764 "data_size": 63488 00:25:08.764 }, 00:25:08.764 { 00:25:08.764 "name": "BaseBdev2", 00:25:08.764 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:25:08.764 "is_configured": true, 00:25:08.764 "data_offset": 2048, 00:25:08.764 "data_size": 63488 00:25:08.764 } 00:25:08.764 ] 00:25:08.764 }' 00:25:08.764 22:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:08.764 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:08.764 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:08.764 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:08.764 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:09.023 [2024-07-12 22:32:19.303057] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:09.023 [2024-07-12 22:32:19.328022] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:09.023 [2024-07-12 
22:32:19.328066] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:09.023 [2024-07-12 22:32:19.328082] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:09.023 [2024-07-12 22:32:19.328090] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:09.282 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:09.282 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:09.282 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:09.282 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:09.282 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:09.282 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:09.282 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:09.282 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:09.282 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:09.282 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:09.282 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.282 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.541 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:09.541 "name": "raid_bdev1", 00:25:09.541 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:25:09.541 "strip_size_kb": 0, 00:25:09.541 "state": "online", 00:25:09.541 "raid_level": "raid1", 00:25:09.541 "superblock": true, 00:25:09.541 "num_base_bdevs": 2, 00:25:09.541 "num_base_bdevs_discovered": 1, 00:25:09.541 "num_base_bdevs_operational": 1, 00:25:09.541 "base_bdevs_list": [ 00:25:09.541 { 00:25:09.541 "name": null, 00:25:09.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:09.541 "is_configured": false, 00:25:09.541 "data_offset": 2048, 00:25:09.541 "data_size": 63488 00:25:09.541 }, 00:25:09.541 { 00:25:09.541 "name": "BaseBdev2", 00:25:09.541 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:25:09.541 "is_configured": true, 00:25:09.541 "data_offset": 2048, 00:25:09.541 "data_size": 63488 00:25:09.541 } 00:25:09.541 ] 00:25:09.541 }' 00:25:09.541 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:09.541 22:32:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:10.108 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:10.108 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:10.108 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:10.108 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:10.108 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
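(The degraded-state checks in this stretch amount to projecting a few fields out of the same bdev_raid_get_bdevs dump. A sketch follows; the RPC call, socket path, and object selection are taken from the log, while the specific jq projection and the expected "online 1/1" output are illustrative, matching the JSON shown above for the degraded array.)

    # Show raid state and discovered/operational base bdev counts for raid_bdev1.
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1")
                 | "\(.state) \(.num_base_bdevs_discovered)/\(.num_base_bdevs_operational)"'
    # Expected while running degraded after the spare was removed: online 1/1
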
00:25:10.108 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.108 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.367 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:10.367 "name": "raid_bdev1", 00:25:10.367 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:25:10.367 "strip_size_kb": 0, 00:25:10.367 "state": "online", 00:25:10.367 "raid_level": "raid1", 00:25:10.367 "superblock": true, 00:25:10.367 "num_base_bdevs": 2, 00:25:10.367 "num_base_bdevs_discovered": 1, 00:25:10.367 "num_base_bdevs_operational": 1, 00:25:10.367 "base_bdevs_list": [ 00:25:10.367 { 00:25:10.367 "name": null, 00:25:10.367 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.367 "is_configured": false, 00:25:10.367 "data_offset": 2048, 00:25:10.367 "data_size": 63488 00:25:10.367 }, 00:25:10.367 { 00:25:10.367 "name": "BaseBdev2", 00:25:10.367 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:25:10.367 "is_configured": true, 00:25:10.367 "data_offset": 2048, 00:25:10.367 "data_size": 63488 00:25:10.367 } 00:25:10.367 ] 00:25:10.367 }' 00:25:10.367 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:10.367 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:10.367 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:10.367 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:10.367 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:10.626 22:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:10.884 [2024-07-12 22:32:21.042389] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:10.884 [2024-07-12 22:32:21.042452] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:10.884 [2024-07-12 22:32:21.042479] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c89140 00:25:10.884 [2024-07-12 22:32:21.042499] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:10.884 [2024-07-12 22:32:21.042876] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:10.884 [2024-07-12 22:32:21.042895] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:10.884 [2024-07-12 22:32:21.042977] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:10.884 [2024-07-12 22:32:21.042991] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:10.884 [2024-07-12 22:32:21.043001] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:10.884 BaseBdev1 00:25:10.884 22:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:11.821 22:32:22 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:11.821 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:11.821 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:11.821 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:11.821 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:11.821 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:11.821 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:11.821 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:11.821 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:11.821 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:11.821 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.821 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.079 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:12.079 "name": "raid_bdev1", 00:25:12.079 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:25:12.079 "strip_size_kb": 0, 00:25:12.079 "state": "online", 00:25:12.079 "raid_level": "raid1", 00:25:12.079 "superblock": true, 00:25:12.079 "num_base_bdevs": 2, 00:25:12.079 "num_base_bdevs_discovered": 1, 00:25:12.079 "num_base_bdevs_operational": 1, 00:25:12.079 "base_bdevs_list": [ 00:25:12.079 { 00:25:12.079 "name": null, 00:25:12.079 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.079 "is_configured": false, 00:25:12.079 "data_offset": 2048, 00:25:12.079 "data_size": 63488 00:25:12.079 }, 00:25:12.079 { 00:25:12.079 "name": "BaseBdev2", 00:25:12.079 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:25:12.079 "is_configured": true, 00:25:12.079 "data_offset": 2048, 00:25:12.079 "data_size": 63488 00:25:12.079 } 00:25:12.079 ] 00:25:12.079 }' 00:25:12.079 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:12.079 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:12.646 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:12.646 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:12.646 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:12.646 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:12.646 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:12.646 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.646 22:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.905 22:32:23 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:12.905 "name": "raid_bdev1", 00:25:12.905 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:25:12.905 "strip_size_kb": 0, 00:25:12.905 "state": "online", 00:25:12.905 "raid_level": "raid1", 00:25:12.905 "superblock": true, 00:25:12.905 "num_base_bdevs": 2, 00:25:12.905 "num_base_bdevs_discovered": 1, 00:25:12.905 "num_base_bdevs_operational": 1, 00:25:12.905 "base_bdevs_list": [ 00:25:12.905 { 00:25:12.905 "name": null, 00:25:12.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.905 "is_configured": false, 00:25:12.905 "data_offset": 2048, 00:25:12.905 "data_size": 63488 00:25:12.905 }, 00:25:12.905 { 00:25:12.905 "name": "BaseBdev2", 00:25:12.905 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:25:12.905 "is_configured": true, 00:25:12.905 "data_offset": 2048, 00:25:12.905 "data_size": 63488 00:25:12.905 } 00:25:12.905 ] 00:25:12.905 }' 00:25:12.905 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:12.905 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:12.905 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:13.164 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:13.164 [2024-07-12 22:32:23.485551] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:13.164 [2024-07-12 
22:32:23.485701] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:13.164 [2024-07-12 22:32:23.485716] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:13.423 request: 00:25:13.423 { 00:25:13.423 "base_bdev": "BaseBdev1", 00:25:13.423 "raid_bdev": "raid_bdev1", 00:25:13.423 "method": "bdev_raid_add_base_bdev", 00:25:13.423 "req_id": 1 00:25:13.423 } 00:25:13.423 Got JSON-RPC error response 00:25:13.423 response: 00:25:13.423 { 00:25:13.423 "code": -22, 00:25:13.423 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:13.423 } 00:25:13.423 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:25:13.423 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:13.423 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:13.423 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:13.423 22:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:14.361 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:14.361 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:14.361 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:14.361 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:14.361 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:14.361 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:14.361 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:14.361 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:14.361 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:14.361 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:14.361 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.361 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.621 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:14.621 "name": "raid_bdev1", 00:25:14.621 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:25:14.621 "strip_size_kb": 0, 00:25:14.621 "state": "online", 00:25:14.621 "raid_level": "raid1", 00:25:14.621 "superblock": true, 00:25:14.621 "num_base_bdevs": 2, 00:25:14.621 "num_base_bdevs_discovered": 1, 00:25:14.621 "num_base_bdevs_operational": 1, 00:25:14.621 "base_bdevs_list": [ 00:25:14.621 { 00:25:14.621 "name": null, 00:25:14.621 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:14.621 "is_configured": false, 00:25:14.621 "data_offset": 2048, 00:25:14.621 "data_size": 63488 00:25:14.621 }, 00:25:14.621 { 00:25:14.621 "name": "BaseBdev2", 00:25:14.621 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:25:14.621 "is_configured": true, 
00:25:14.621 "data_offset": 2048, 00:25:14.621 "data_size": 63488 00:25:14.621 } 00:25:14.621 ] 00:25:14.621 }' 00:25:14.621 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:14.621 22:32:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:15.188 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:15.188 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:15.188 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:15.188 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:15.188 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:15.188 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.188 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:15.448 "name": "raid_bdev1", 00:25:15.448 "uuid": "acbec1b8-3bbe-410b-ba90-d4a82033978b", 00:25:15.448 "strip_size_kb": 0, 00:25:15.448 "state": "online", 00:25:15.448 "raid_level": "raid1", 00:25:15.448 "superblock": true, 00:25:15.448 "num_base_bdevs": 2, 00:25:15.448 "num_base_bdevs_discovered": 1, 00:25:15.448 "num_base_bdevs_operational": 1, 00:25:15.448 "base_bdevs_list": [ 00:25:15.448 { 00:25:15.448 "name": null, 00:25:15.448 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.448 "is_configured": false, 00:25:15.448 "data_offset": 2048, 00:25:15.448 "data_size": 63488 00:25:15.448 }, 00:25:15.448 { 00:25:15.448 "name": "BaseBdev2", 00:25:15.448 "uuid": "ed1c1e0f-d678-509e-a62e-a32de82702a5", 00:25:15.448 "is_configured": true, 00:25:15.448 "data_offset": 2048, 00:25:15.448 "data_size": 63488 00:25:15.448 } 00:25:15.448 ] 00:25:15.448 }' 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 3538508 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 3538508 ']' 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 3538508 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3538508 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3538508' 00:25:15.448 killing process with pid 3538508 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 3538508 00:25:15.448 Received shutdown signal, test time was about 26.575698 seconds 00:25:15.448 00:25:15.448 Latency(us) 00:25:15.448 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:15.448 =================================================================================================================== 00:25:15.448 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:15.448 [2024-07-12 22:32:25.736279] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:15.448 [2024-07-12 22:32:25.736373] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:15.448 [2024-07-12 22:32:25.736421] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:15.448 [2024-07-12 22:32:25.736433] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad6f70 name raid_bdev1, state offline 00:25:15.448 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 3538508 00:25:15.448 [2024-07-12 22:32:25.757707] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:15.707 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:15.707 00:25:15.707 real 0m31.289s 00:25:15.707 user 0m48.841s 00:25:15.707 sys 0m4.483s 00:25:15.707 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:15.707 22:32:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:15.707 ************************************ 00:25:15.707 END TEST raid_rebuild_test_sb_io 00:25:15.707 ************************************ 00:25:15.707 22:32:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:15.707 22:32:26 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:25:15.707 22:32:26 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:25:15.707 22:32:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:15.707 22:32:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:15.707 22:32:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:15.968 ************************************ 00:25:15.968 START TEST raid_rebuild_test 00:25:15.968 ************************************ 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo 
BaseBdev1 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=3542993 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 3542993 /var/tmp/spdk-raid.sock 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 3542993 ']' 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:15.968 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:15.968 22:32:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:15.968 [2024-07-12 22:32:26.119968] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
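Condensed, the setup this raid_rebuild_test trace walks through next is: launch bdevperf on the dedicated RPC socket, create four passthru-on-malloc base bdevs, and assemble them into a raid1 array named raid_bdev1. A bash sketch using only paths, option strings, and RPC forms that appear in this log (the loop is an illustrative condensation of the unrolled trace, and the spare/spare_delay bdevs are omitted):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # bdevperf started with the option string shown in the trace above
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    # once the socket is listening, build the base bdevs and the array:
    for i in 1 2 3 4; do
        $rpc bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
        $rpc bdev_passthru_create -b "BaseBdev${i}_malloc" -p "BaseBdev${i}"
    done
    $rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1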
00:25:15.968 [2024-07-12 22:32:26.120034] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3542993 ] 00:25:15.968 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:15.968 Zero copy mechanism will not be used. 00:25:15.968 [2024-07-12 22:32:26.249331] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:16.228 [2024-07-12 22:32:26.355693] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:16.228 [2024-07-12 22:32:26.419219] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:16.228 [2024-07-12 22:32:26.419256] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:16.797 22:32:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:16.797 22:32:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:25:16.797 22:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:16.797 22:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:17.071 BaseBdev1_malloc 00:25:17.071 22:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:17.388 [2024-07-12 22:32:27.520310] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:17.388 [2024-07-12 22:32:27.520359] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:17.388 [2024-07-12 22:32:27.520384] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1593d40 00:25:17.388 [2024-07-12 22:32:27.520397] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:17.388 [2024-07-12 22:32:27.522122] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:17.388 [2024-07-12 22:32:27.522149] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:17.388 BaseBdev1 00:25:17.388 22:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:17.388 22:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:17.647 BaseBdev2_malloc 00:25:17.647 22:32:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:17.907 [2024-07-12 22:32:27.994404] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:17.907 [2024-07-12 22:32:27.994448] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:17.907 [2024-07-12 22:32:27.994474] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1594860 00:25:17.907 [2024-07-12 22:32:27.994487] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:17.907 [2024-07-12 22:32:27.996038] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:17.907 [2024-07-12 22:32:27.996065] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:17.907 BaseBdev2 00:25:17.907 22:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:17.907 22:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:18.167 BaseBdev3_malloc 00:25:18.167 22:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:18.167 [2024-07-12 22:32:28.489562] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:18.167 [2024-07-12 22:32:28.489609] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:18.167 [2024-07-12 22:32:28.489632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17418f0 00:25:18.167 [2024-07-12 22:32:28.489644] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:18.167 [2024-07-12 22:32:28.491232] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:18.167 [2024-07-12 22:32:28.491260] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:18.426 BaseBdev3 00:25:18.426 22:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:18.426 22:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:18.426 BaseBdev4_malloc 00:25:18.426 22:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:18.685 [2024-07-12 22:32:28.972524] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:18.685 [2024-07-12 22:32:28.972569] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:18.685 [2024-07-12 22:32:28.972590] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1740ad0 00:25:18.685 [2024-07-12 22:32:28.972602] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:18.685 [2024-07-12 22:32:28.974168] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:18.685 [2024-07-12 22:32:28.974197] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:18.685 BaseBdev4 00:25:18.685 22:32:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:18.944 spare_malloc 00:25:18.944 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:19.203 spare_delay 00:25:19.203 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:19.462 [2024-07-12 22:32:29.698998] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:19.462 [2024-07-12 22:32:29.699045] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:19.462 [2024-07-12 22:32:29.699066] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17455b0 00:25:19.462 [2024-07-12 22:32:29.699078] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:19.462 [2024-07-12 22:32:29.700666] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:19.462 [2024-07-12 22:32:29.700694] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:19.462 spare 00:25:19.462 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:19.721 [2024-07-12 22:32:29.939654] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:19.721 [2024-07-12 22:32:29.941005] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:19.721 [2024-07-12 22:32:29.941062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:19.721 [2024-07-12 22:32:29.941107] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:19.721 [2024-07-12 22:32:29.941193] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16c48a0 00:25:19.721 [2024-07-12 22:32:29.941204] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:19.721 [2024-07-12 22:32:29.941426] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x173ee10 00:25:19.721 [2024-07-12 22:32:29.941575] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16c48a0 00:25:19.721 [2024-07-12 22:32:29.941585] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16c48a0 00:25:19.721 [2024-07-12 22:32:29.941704] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:19.721 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:19.721 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:19.721 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:19.721 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:19.721 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:19.721 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:19.721 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:19.721 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:19.721 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:19.721 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:19.721 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.721 22:32:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.981 22:32:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:19.981 "name": "raid_bdev1", 00:25:19.981 "uuid": "6f6d7fc4-fcc3-431b-9fba-7b73bd897393", 00:25:19.981 "strip_size_kb": 0, 00:25:19.981 "state": "online", 00:25:19.981 "raid_level": "raid1", 00:25:19.981 "superblock": false, 00:25:19.981 "num_base_bdevs": 4, 00:25:19.981 "num_base_bdevs_discovered": 4, 00:25:19.981 "num_base_bdevs_operational": 4, 00:25:19.981 "base_bdevs_list": [ 00:25:19.981 { 00:25:19.981 "name": "BaseBdev1", 00:25:19.981 "uuid": "d2af516c-09d4-50b9-a5f9-f17c615b1ff6", 00:25:19.981 "is_configured": true, 00:25:19.981 "data_offset": 0, 00:25:19.981 "data_size": 65536 00:25:19.981 }, 00:25:19.981 { 00:25:19.981 "name": "BaseBdev2", 00:25:19.981 "uuid": "12793085-26bc-56ce-a470-db093635c696", 00:25:19.981 "is_configured": true, 00:25:19.981 "data_offset": 0, 00:25:19.981 "data_size": 65536 00:25:19.981 }, 00:25:19.981 { 00:25:19.981 "name": "BaseBdev3", 00:25:19.981 "uuid": "b2792db0-e077-525e-98ff-f91e6398c6dc", 00:25:19.981 "is_configured": true, 00:25:19.981 "data_offset": 0, 00:25:19.981 "data_size": 65536 00:25:19.981 }, 00:25:19.981 { 00:25:19.981 "name": "BaseBdev4", 00:25:19.981 "uuid": "cdbeeddc-4ae7-5af1-aa5d-184498054f97", 00:25:19.981 "is_configured": true, 00:25:19.981 "data_offset": 0, 00:25:19.981 "data_size": 65536 00:25:19.981 } 00:25:19.981 ] 00:25:19.981 }' 00:25:19.981 22:32:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:19.981 22:32:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:20.549 22:32:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:20.549 22:32:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:20.807 [2024-07-12 22:32:31.030820] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:20.807 22:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:25:20.807 22:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.807 22:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 
-- # nbd_list=('/dev/nbd0') 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:21.065 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:21.324 [2024-07-12 22:32:31.403544] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x173ee10 00:25:21.324 /dev/nbd0 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:21.324 1+0 records in 00:25:21.324 1+0 records out 00:25:21.324 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273415 s, 15.0 MB/s 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:25:21.324 22:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:25:29.431 65536+0 records in 00:25:29.431 65536+0 records out 00:25:29.431 33554432 bytes (34 MB, 32 MiB) copied, 7.50708 s, 4.5 MB/s 00:25:29.431 22:32:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:29.431 22:32:38 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:29.431 22:32:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:29.431 22:32:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:29.431 22:32:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:29.431 22:32:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:29.431 22:32:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:29.431 22:32:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:29.431 [2024-07-12 22:32:39.236408] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:29.431 22:32:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:29.431 22:32:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:29.431 22:32:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:29.432 [2024-07-12 22:32:39.465058] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:29.432 "name": "raid_bdev1", 00:25:29.432 "uuid": "6f6d7fc4-fcc3-431b-9fba-7b73bd897393", 00:25:29.432 "strip_size_kb": 0, 00:25:29.432 "state": "online", 00:25:29.432 
"raid_level": "raid1", 00:25:29.432 "superblock": false, 00:25:29.432 "num_base_bdevs": 4, 00:25:29.432 "num_base_bdevs_discovered": 3, 00:25:29.432 "num_base_bdevs_operational": 3, 00:25:29.432 "base_bdevs_list": [ 00:25:29.432 { 00:25:29.432 "name": null, 00:25:29.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.432 "is_configured": false, 00:25:29.432 "data_offset": 0, 00:25:29.432 "data_size": 65536 00:25:29.432 }, 00:25:29.432 { 00:25:29.432 "name": "BaseBdev2", 00:25:29.432 "uuid": "12793085-26bc-56ce-a470-db093635c696", 00:25:29.432 "is_configured": true, 00:25:29.432 "data_offset": 0, 00:25:29.432 "data_size": 65536 00:25:29.432 }, 00:25:29.432 { 00:25:29.432 "name": "BaseBdev3", 00:25:29.432 "uuid": "b2792db0-e077-525e-98ff-f91e6398c6dc", 00:25:29.432 "is_configured": true, 00:25:29.432 "data_offset": 0, 00:25:29.432 "data_size": 65536 00:25:29.432 }, 00:25:29.432 { 00:25:29.432 "name": "BaseBdev4", 00:25:29.432 "uuid": "cdbeeddc-4ae7-5af1-aa5d-184498054f97", 00:25:29.432 "is_configured": true, 00:25:29.432 "data_offset": 0, 00:25:29.432 "data_size": 65536 00:25:29.432 } 00:25:29.432 ] 00:25:29.432 }' 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:29.432 22:32:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:30.366 22:32:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:30.366 [2024-07-12 22:32:40.556107] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:30.366 [2024-07-12 22:32:40.560248] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16ca6b0 00:25:30.366 [2024-07-12 22:32:40.562619] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:30.366 22:32:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:31.300 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:31.300 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:31.300 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:31.300 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:31.300 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:31.300 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.300 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.558 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:31.558 "name": "raid_bdev1", 00:25:31.558 "uuid": "6f6d7fc4-fcc3-431b-9fba-7b73bd897393", 00:25:31.558 "strip_size_kb": 0, 00:25:31.558 "state": "online", 00:25:31.558 "raid_level": "raid1", 00:25:31.558 "superblock": false, 00:25:31.558 "num_base_bdevs": 4, 00:25:31.558 "num_base_bdevs_discovered": 4, 00:25:31.558 "num_base_bdevs_operational": 4, 00:25:31.558 "process": { 00:25:31.558 "type": "rebuild", 00:25:31.558 "target": "spare", 00:25:31.558 "progress": { 00:25:31.558 "blocks": 22528, 00:25:31.558 "percent": 34 00:25:31.558 } 00:25:31.558 }, 00:25:31.558 
"base_bdevs_list": [ 00:25:31.558 { 00:25:31.558 "name": "spare", 00:25:31.558 "uuid": "f84bc3a8-2a3f-571e-ba38-c9b649eb7a4f", 00:25:31.558 "is_configured": true, 00:25:31.558 "data_offset": 0, 00:25:31.558 "data_size": 65536 00:25:31.558 }, 00:25:31.558 { 00:25:31.558 "name": "BaseBdev2", 00:25:31.558 "uuid": "12793085-26bc-56ce-a470-db093635c696", 00:25:31.558 "is_configured": true, 00:25:31.558 "data_offset": 0, 00:25:31.558 "data_size": 65536 00:25:31.558 }, 00:25:31.558 { 00:25:31.558 "name": "BaseBdev3", 00:25:31.558 "uuid": "b2792db0-e077-525e-98ff-f91e6398c6dc", 00:25:31.558 "is_configured": true, 00:25:31.558 "data_offset": 0, 00:25:31.558 "data_size": 65536 00:25:31.558 }, 00:25:31.558 { 00:25:31.558 "name": "BaseBdev4", 00:25:31.558 "uuid": "cdbeeddc-4ae7-5af1-aa5d-184498054f97", 00:25:31.558 "is_configured": true, 00:25:31.558 "data_offset": 0, 00:25:31.558 "data_size": 65536 00:25:31.558 } 00:25:31.558 ] 00:25:31.558 }' 00:25:31.558 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:31.558 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:31.558 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:31.558 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:31.558 22:32:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:31.816 [2024-07-12 22:32:42.085210] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:32.074 [2024-07-12 22:32:42.175278] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:32.074 [2024-07-12 22:32:42.175325] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:32.074 [2024-07-12 22:32:42.175343] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:32.074 [2024-07-12 22:32:42.175351] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:32.074 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:32.074 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:32.074 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:32.074 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:32.074 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:32.074 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:32.074 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:32.074 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:32.074 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:32.074 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:32.074 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.075 22:32:42 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.333 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:32.333 "name": "raid_bdev1", 00:25:32.333 "uuid": "6f6d7fc4-fcc3-431b-9fba-7b73bd897393", 00:25:32.333 "strip_size_kb": 0, 00:25:32.333 "state": "online", 00:25:32.333 "raid_level": "raid1", 00:25:32.333 "superblock": false, 00:25:32.333 "num_base_bdevs": 4, 00:25:32.333 "num_base_bdevs_discovered": 3, 00:25:32.333 "num_base_bdevs_operational": 3, 00:25:32.333 "base_bdevs_list": [ 00:25:32.333 { 00:25:32.333 "name": null, 00:25:32.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:32.333 "is_configured": false, 00:25:32.333 "data_offset": 0, 00:25:32.333 "data_size": 65536 00:25:32.333 }, 00:25:32.333 { 00:25:32.333 "name": "BaseBdev2", 00:25:32.333 "uuid": "12793085-26bc-56ce-a470-db093635c696", 00:25:32.333 "is_configured": true, 00:25:32.333 "data_offset": 0, 00:25:32.333 "data_size": 65536 00:25:32.333 }, 00:25:32.333 { 00:25:32.333 "name": "BaseBdev3", 00:25:32.333 "uuid": "b2792db0-e077-525e-98ff-f91e6398c6dc", 00:25:32.333 "is_configured": true, 00:25:32.333 "data_offset": 0, 00:25:32.333 "data_size": 65536 00:25:32.333 }, 00:25:32.333 { 00:25:32.333 "name": "BaseBdev4", 00:25:32.333 "uuid": "cdbeeddc-4ae7-5af1-aa5d-184498054f97", 00:25:32.333 "is_configured": true, 00:25:32.333 "data_offset": 0, 00:25:32.333 "data_size": 65536 00:25:32.333 } 00:25:32.333 ] 00:25:32.333 }' 00:25:32.333 22:32:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:32.333 22:32:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:32.897 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:32.897 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:32.897 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:32.897 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:32.897 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:32.897 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.897 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.154 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:33.154 "name": "raid_bdev1", 00:25:33.154 "uuid": "6f6d7fc4-fcc3-431b-9fba-7b73bd897393", 00:25:33.154 "strip_size_kb": 0, 00:25:33.154 "state": "online", 00:25:33.154 "raid_level": "raid1", 00:25:33.154 "superblock": false, 00:25:33.154 "num_base_bdevs": 4, 00:25:33.154 "num_base_bdevs_discovered": 3, 00:25:33.154 "num_base_bdevs_operational": 3, 00:25:33.154 "base_bdevs_list": [ 00:25:33.154 { 00:25:33.154 "name": null, 00:25:33.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:33.154 "is_configured": false, 00:25:33.154 "data_offset": 0, 00:25:33.154 "data_size": 65536 00:25:33.154 }, 00:25:33.154 { 00:25:33.154 "name": "BaseBdev2", 00:25:33.154 "uuid": "12793085-26bc-56ce-a470-db093635c696", 00:25:33.154 "is_configured": true, 00:25:33.154 "data_offset": 0, 00:25:33.154 "data_size": 65536 00:25:33.154 }, 00:25:33.154 { 00:25:33.154 "name": "BaseBdev3", 00:25:33.154 "uuid": 
"b2792db0-e077-525e-98ff-f91e6398c6dc", 00:25:33.154 "is_configured": true, 00:25:33.154 "data_offset": 0, 00:25:33.154 "data_size": 65536 00:25:33.154 }, 00:25:33.154 { 00:25:33.154 "name": "BaseBdev4", 00:25:33.154 "uuid": "cdbeeddc-4ae7-5af1-aa5d-184498054f97", 00:25:33.154 "is_configured": true, 00:25:33.154 "data_offset": 0, 00:25:33.154 "data_size": 65536 00:25:33.154 } 00:25:33.154 ] 00:25:33.154 }' 00:25:33.154 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:33.154 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:33.154 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:33.154 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:33.154 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:33.411 [2024-07-12 22:32:43.595156] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:33.411 [2024-07-12 22:32:43.599185] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16ca6b0 00:25:33.411 [2024-07-12 22:32:43.600685] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:33.411 22:32:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:34.342 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:34.343 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:34.343 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:34.343 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:34.343 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:34.343 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.343 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.600 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:34.600 "name": "raid_bdev1", 00:25:34.600 "uuid": "6f6d7fc4-fcc3-431b-9fba-7b73bd897393", 00:25:34.600 "strip_size_kb": 0, 00:25:34.600 "state": "online", 00:25:34.600 "raid_level": "raid1", 00:25:34.600 "superblock": false, 00:25:34.600 "num_base_bdevs": 4, 00:25:34.600 "num_base_bdevs_discovered": 4, 00:25:34.600 "num_base_bdevs_operational": 4, 00:25:34.600 "process": { 00:25:34.600 "type": "rebuild", 00:25:34.600 "target": "spare", 00:25:34.600 "progress": { 00:25:34.600 "blocks": 24576, 00:25:34.600 "percent": 37 00:25:34.600 } 00:25:34.600 }, 00:25:34.600 "base_bdevs_list": [ 00:25:34.600 { 00:25:34.600 "name": "spare", 00:25:34.600 "uuid": "f84bc3a8-2a3f-571e-ba38-c9b649eb7a4f", 00:25:34.600 "is_configured": true, 00:25:34.600 "data_offset": 0, 00:25:34.601 "data_size": 65536 00:25:34.601 }, 00:25:34.601 { 00:25:34.601 "name": "BaseBdev2", 00:25:34.601 "uuid": "12793085-26bc-56ce-a470-db093635c696", 00:25:34.601 "is_configured": true, 00:25:34.601 "data_offset": 0, 00:25:34.601 "data_size": 65536 00:25:34.601 }, 00:25:34.601 { 
00:25:34.601 "name": "BaseBdev3", 00:25:34.601 "uuid": "b2792db0-e077-525e-98ff-f91e6398c6dc", 00:25:34.601 "is_configured": true, 00:25:34.601 "data_offset": 0, 00:25:34.601 "data_size": 65536 00:25:34.601 }, 00:25:34.601 { 00:25:34.601 "name": "BaseBdev4", 00:25:34.601 "uuid": "cdbeeddc-4ae7-5af1-aa5d-184498054f97", 00:25:34.601 "is_configured": true, 00:25:34.601 "data_offset": 0, 00:25:34.601 "data_size": 65536 00:25:34.601 } 00:25:34.601 ] 00:25:34.601 }' 00:25:34.601 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:34.601 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:34.601 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:34.858 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:34.858 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:34.858 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:34.858 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:34.858 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:34.858 22:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:35.114 [2024-07-12 22:32:45.189226] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:35.114 [2024-07-12 22:32:45.213248] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x16ca6b0 00:25:35.114 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:35.114 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:35.114 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:35.114 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.114 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:35.114 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:35.114 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.114 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.114 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:35.372 "name": "raid_bdev1", 00:25:35.372 "uuid": "6f6d7fc4-fcc3-431b-9fba-7b73bd897393", 00:25:35.372 "strip_size_kb": 0, 00:25:35.372 "state": "online", 00:25:35.372 "raid_level": "raid1", 00:25:35.372 "superblock": false, 00:25:35.372 "num_base_bdevs": 4, 00:25:35.372 "num_base_bdevs_discovered": 3, 00:25:35.372 "num_base_bdevs_operational": 3, 00:25:35.372 "process": { 00:25:35.372 "type": "rebuild", 00:25:35.372 "target": "spare", 00:25:35.372 "progress": { 00:25:35.372 "blocks": 36864, 00:25:35.372 "percent": 56 00:25:35.372 } 00:25:35.372 }, 00:25:35.372 "base_bdevs_list": [ 00:25:35.372 { 
00:25:35.372 "name": "spare", 00:25:35.372 "uuid": "f84bc3a8-2a3f-571e-ba38-c9b649eb7a4f", 00:25:35.372 "is_configured": true, 00:25:35.372 "data_offset": 0, 00:25:35.372 "data_size": 65536 00:25:35.372 }, 00:25:35.372 { 00:25:35.372 "name": null, 00:25:35.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:35.372 "is_configured": false, 00:25:35.372 "data_offset": 0, 00:25:35.372 "data_size": 65536 00:25:35.372 }, 00:25:35.372 { 00:25:35.372 "name": "BaseBdev3", 00:25:35.372 "uuid": "b2792db0-e077-525e-98ff-f91e6398c6dc", 00:25:35.372 "is_configured": true, 00:25:35.372 "data_offset": 0, 00:25:35.372 "data_size": 65536 00:25:35.372 }, 00:25:35.372 { 00:25:35.372 "name": "BaseBdev4", 00:25:35.372 "uuid": "cdbeeddc-4ae7-5af1-aa5d-184498054f97", 00:25:35.372 "is_configured": true, 00:25:35.372 "data_offset": 0, 00:25:35.372 "data_size": 65536 00:25:35.372 } 00:25:35.372 ] 00:25:35.372 }' 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=869 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.372 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.629 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:35.629 "name": "raid_bdev1", 00:25:35.629 "uuid": "6f6d7fc4-fcc3-431b-9fba-7b73bd897393", 00:25:35.629 "strip_size_kb": 0, 00:25:35.629 "state": "online", 00:25:35.629 "raid_level": "raid1", 00:25:35.629 "superblock": false, 00:25:35.629 "num_base_bdevs": 4, 00:25:35.629 "num_base_bdevs_discovered": 3, 00:25:35.629 "num_base_bdevs_operational": 3, 00:25:35.629 "process": { 00:25:35.629 "type": "rebuild", 00:25:35.629 "target": "spare", 00:25:35.630 "progress": { 00:25:35.630 "blocks": 43008, 00:25:35.630 "percent": 65 00:25:35.630 } 00:25:35.630 }, 00:25:35.630 "base_bdevs_list": [ 00:25:35.630 { 00:25:35.630 "name": "spare", 00:25:35.630 "uuid": "f84bc3a8-2a3f-571e-ba38-c9b649eb7a4f", 00:25:35.630 "is_configured": true, 00:25:35.630 "data_offset": 0, 00:25:35.630 "data_size": 65536 00:25:35.630 }, 00:25:35.630 { 00:25:35.630 "name": null, 00:25:35.630 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:35.630 "is_configured": false, 00:25:35.630 "data_offset": 0, 00:25:35.630 "data_size": 65536 00:25:35.630 }, 
00:25:35.630 { 00:25:35.630 "name": "BaseBdev3", 00:25:35.630 "uuid": "b2792db0-e077-525e-98ff-f91e6398c6dc", 00:25:35.630 "is_configured": true, 00:25:35.630 "data_offset": 0, 00:25:35.630 "data_size": 65536 00:25:35.630 }, 00:25:35.630 { 00:25:35.630 "name": "BaseBdev4", 00:25:35.630 "uuid": "cdbeeddc-4ae7-5af1-aa5d-184498054f97", 00:25:35.630 "is_configured": true, 00:25:35.630 "data_offset": 0, 00:25:35.630 "data_size": 65536 00:25:35.630 } 00:25:35.630 ] 00:25:35.630 }' 00:25:35.630 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:35.630 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:35.630 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:35.630 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:35.630 22:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:36.561 [2024-07-12 22:32:46.825732] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:36.561 [2024-07-12 22:32:46.825793] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:36.561 [2024-07-12 22:32:46.825831] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:36.561 22:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:36.561 22:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:36.561 22:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:36.561 22:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:36.561 22:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:36.561 22:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:36.818 22:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.818 22:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.818 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:36.818 "name": "raid_bdev1", 00:25:36.818 "uuid": "6f6d7fc4-fcc3-431b-9fba-7b73bd897393", 00:25:36.818 "strip_size_kb": 0, 00:25:36.818 "state": "online", 00:25:36.818 "raid_level": "raid1", 00:25:36.818 "superblock": false, 00:25:36.818 "num_base_bdevs": 4, 00:25:36.818 "num_base_bdevs_discovered": 3, 00:25:36.818 "num_base_bdevs_operational": 3, 00:25:36.818 "base_bdevs_list": [ 00:25:36.818 { 00:25:36.818 "name": "spare", 00:25:36.818 "uuid": "f84bc3a8-2a3f-571e-ba38-c9b649eb7a4f", 00:25:36.818 "is_configured": true, 00:25:36.818 "data_offset": 0, 00:25:36.818 "data_size": 65536 00:25:36.818 }, 00:25:36.818 { 00:25:36.818 "name": null, 00:25:36.818 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.818 "is_configured": false, 00:25:36.818 "data_offset": 0, 00:25:36.818 "data_size": 65536 00:25:36.818 }, 00:25:36.818 { 00:25:36.818 "name": "BaseBdev3", 00:25:36.818 "uuid": "b2792db0-e077-525e-98ff-f91e6398c6dc", 00:25:36.818 "is_configured": true, 00:25:36.818 "data_offset": 0, 00:25:36.818 "data_size": 65536 00:25:36.818 }, 00:25:36.818 { 00:25:36.818 "name": 
"BaseBdev4", 00:25:36.818 "uuid": "cdbeeddc-4ae7-5af1-aa5d-184498054f97", 00:25:36.818 "is_configured": true, 00:25:36.818 "data_offset": 0, 00:25:36.818 "data_size": 65536 00:25:36.818 } 00:25:36.818 ] 00:25:36.818 }' 00:25:36.818 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:37.076 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:37.076 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:37.076 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:37.076 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:25:37.076 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:37.076 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:37.076 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:37.076 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:37.076 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:37.076 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.076 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:37.333 "name": "raid_bdev1", 00:25:37.333 "uuid": "6f6d7fc4-fcc3-431b-9fba-7b73bd897393", 00:25:37.333 "strip_size_kb": 0, 00:25:37.333 "state": "online", 00:25:37.333 "raid_level": "raid1", 00:25:37.333 "superblock": false, 00:25:37.333 "num_base_bdevs": 4, 00:25:37.333 "num_base_bdevs_discovered": 3, 00:25:37.333 "num_base_bdevs_operational": 3, 00:25:37.333 "base_bdevs_list": [ 00:25:37.333 { 00:25:37.333 "name": "spare", 00:25:37.333 "uuid": "f84bc3a8-2a3f-571e-ba38-c9b649eb7a4f", 00:25:37.333 "is_configured": true, 00:25:37.333 "data_offset": 0, 00:25:37.333 "data_size": 65536 00:25:37.333 }, 00:25:37.333 { 00:25:37.333 "name": null, 00:25:37.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:37.333 "is_configured": false, 00:25:37.333 "data_offset": 0, 00:25:37.333 "data_size": 65536 00:25:37.333 }, 00:25:37.333 { 00:25:37.333 "name": "BaseBdev3", 00:25:37.333 "uuid": "b2792db0-e077-525e-98ff-f91e6398c6dc", 00:25:37.333 "is_configured": true, 00:25:37.333 "data_offset": 0, 00:25:37.333 "data_size": 65536 00:25:37.333 }, 00:25:37.333 { 00:25:37.333 "name": "BaseBdev4", 00:25:37.333 "uuid": "cdbeeddc-4ae7-5af1-aa5d-184498054f97", 00:25:37.333 "is_configured": true, 00:25:37.333 "data_offset": 0, 00:25:37.333 "data_size": 65536 00:25:37.333 } 00:25:37.333 ] 00:25:37.333 }' 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 3 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.333 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:37.591 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:37.591 "name": "raid_bdev1", 00:25:37.591 "uuid": "6f6d7fc4-fcc3-431b-9fba-7b73bd897393", 00:25:37.591 "strip_size_kb": 0, 00:25:37.591 "state": "online", 00:25:37.591 "raid_level": "raid1", 00:25:37.591 "superblock": false, 00:25:37.591 "num_base_bdevs": 4, 00:25:37.591 "num_base_bdevs_discovered": 3, 00:25:37.591 "num_base_bdevs_operational": 3, 00:25:37.591 "base_bdevs_list": [ 00:25:37.591 { 00:25:37.591 "name": "spare", 00:25:37.591 "uuid": "f84bc3a8-2a3f-571e-ba38-c9b649eb7a4f", 00:25:37.591 "is_configured": true, 00:25:37.591 "data_offset": 0, 00:25:37.591 "data_size": 65536 00:25:37.591 }, 00:25:37.591 { 00:25:37.591 "name": null, 00:25:37.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:37.591 "is_configured": false, 00:25:37.591 "data_offset": 0, 00:25:37.591 "data_size": 65536 00:25:37.591 }, 00:25:37.591 { 00:25:37.591 "name": "BaseBdev3", 00:25:37.591 "uuid": "b2792db0-e077-525e-98ff-f91e6398c6dc", 00:25:37.591 "is_configured": true, 00:25:37.591 "data_offset": 0, 00:25:37.591 "data_size": 65536 00:25:37.591 }, 00:25:37.591 { 00:25:37.591 "name": "BaseBdev4", 00:25:37.591 "uuid": "cdbeeddc-4ae7-5af1-aa5d-184498054f97", 00:25:37.591 "is_configured": true, 00:25:37.591 "data_offset": 0, 00:25:37.591 "data_size": 65536 00:25:37.591 } 00:25:37.591 ] 00:25:37.591 }' 00:25:37.591 22:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:37.591 22:32:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:38.155 22:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:38.413 [2024-07-12 22:32:48.618311] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:38.413 [2024-07-12 22:32:48.618341] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:38.413 [2024-07-12 22:32:48.618404] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:38.413 [2024-07-12 22:32:48.618471] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid 
bdev base bdevs is 0, going to free all in destruct 00:25:38.413 [2024-07-12 22:32:48.618484] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c48a0 name raid_bdev1, state offline 00:25:38.413 22:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.413 22:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:38.672 22:32:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:38.930 /dev/nbd0 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:38.930 1+0 records in 00:25:38.930 1+0 records out 00:25:38.930 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237878 s, 17.2 MB/s 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@884 -- # size=4096 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:38.930 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:39.189 /dev/nbd1 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:39.189 1+0 records in 00:25:39.189 1+0 records out 00:25:39.189 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000331409 s, 12.4 MB/s 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local 
nbd_list 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:39.189 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:39.448 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:39.448 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:39.448 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:39.448 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:39.448 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:39.448 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:39.448 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:39.448 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:39.448 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:39.448 22:32:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:39.736 22:32:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:39.736 22:32:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:39.736 22:32:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:39.736 22:32:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:39.736 22:32:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:39.736 22:32:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 3542993 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 3542993 ']' 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 3542993 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3542993 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3542993' 00:25:40.006 killing process with pid 3542993 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 3542993 00:25:40.006 Received shutdown signal, test time was about 60.000000 
seconds 00:25:40.006 00:25:40.006 Latency(us) 00:25:40.006 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:40.006 =================================================================================================================== 00:25:40.006 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:40.006 [2024-07-12 22:32:50.082637] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:40.006 22:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 3542993 00:25:40.006 [2024-07-12 22:32:50.131676] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:25:40.265 00:25:40.265 real 0m24.305s 00:25:40.265 user 0m32.592s 00:25:40.265 sys 0m5.308s 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:40.265 ************************************ 00:25:40.265 END TEST raid_rebuild_test 00:25:40.265 ************************************ 00:25:40.265 22:32:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:40.265 22:32:50 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:25:40.265 22:32:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:40.265 22:32:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:40.265 22:32:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:40.265 ************************************ 00:25:40.265 START TEST raid_rebuild_test_sb 00:25:40.265 ************************************ 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:40.265 
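The raid_rebuild_test flow traced above drives all of its state checks through the verify_raid_bdev_process and verify_raid_bdev_state helpers: each check issues bdev_raid_get_bdevs over the test's dedicated RPC socket, filters the JSON for raid_bdev1 with jq, and compares .process.type / .process.target against the expected rebuild state. The snippet below is a minimal paraphrase of that pattern as inferred from the trace (bdev_raid.sh@182-190); the helper's real body is not reproduced in this log, so the exact structure and error handling shown here are an assumption, while the RPC call, socket path, and jq filters are copied from the trace.

    # Sketch of the state check visible in the trace above; socket path and jq
    # expressions taken verbatim from the log, control flow simplified.
    verify_raid_bdev_process() {
        local raid_bdev_name=$1 process_type=$2 target=$3
        local raid_bdev_info

        # Dump all RAID bdevs from the bdevperf app and keep only the bdev under test.
        raid_bdev_info=$(scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
            jq -r ".[] | select(.name == \"$raid_bdev_name\")")

        # While a rebuild is running the RPC reports a .process object; when it is
        # absent, the '// "none"' fallback maps the missing field to the string "none".
        [[ $(jq -r '.process.type // "none"' <<< "$raid_bdev_info") == "$process_type" ]]
        [[ $(jq -r '.process.target // "none"' <<< "$raid_bdev_info") == "$target" ]]
    }

A passing rebuild check is what shows up in the trace as the '[[ rebuild == \r\e\b\u\i\l\d ]]' and '[[ spare == \s\p\a\r\e ]]' comparisons, and the final post-rebuild check expects both fields to fall back to "none".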
22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:40.265 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:40.266 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:40.266 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=3546395 00:25:40.266 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 3546395 /var/tmp/spdk-raid.sock 00:25:40.266 22:32:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 3546395 ']' 00:25:40.266 22:32:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:40.266 22:32:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:40.266 22:32:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:40.266 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:40.266 22:32:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:40.266 22:32:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:40.266 22:32:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:40.266 [2024-07-12 22:32:50.498520] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:25:40.266 [2024-07-12 22:32:50.498587] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3546395 ] 00:25:40.266 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:40.266 Zero copy mechanism will not be used. 
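The raid_rebuild_test_sb prologue above starts a dedicated bdevperf instance and then parks the shell on waitforlisten until that app's RPC socket answers; every later bdev_* RPC in the trace is issued against the same -s /var/tmp/spdk-raid.sock socket. The lines below are a hedged reconstruction of that launch step with the flags copied verbatim from the trace; the surrounding helper logic and the shortened binary path are assumptions, not a quote of the script.

    # Start bdevperf on a private RPC socket; -L bdev_raid enables the *DEBUG* raid
    # log lines seen in this trace, and -z keeps the workload idle until it is
    # driven over RPC by the test instead of starting immediately.
    ./build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 \
        -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!

    # Block until the UNIX-domain RPC socket is ready before configuring any bdevs.
    waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock

The '-o 3M' I/O size (3145728 bytes) is also what triggers the "I/O size of 3145728 is greater than zero copy threshold (65536)" notice printed at startup.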
00:25:40.524 [2024-07-12 22:32:50.628615] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:40.524 [2024-07-12 22:32:50.735094] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:40.524 [2024-07-12 22:32:50.801319] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:40.524 [2024-07-12 22:32:50.801351] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:41.091 22:32:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:41.091 22:32:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:25:41.091 22:32:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:41.091 22:32:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:41.350 BaseBdev1_malloc 00:25:41.350 22:32:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:41.609 [2024-07-12 22:32:51.874912] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:41.609 [2024-07-12 22:32:51.874966] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:41.609 [2024-07-12 22:32:51.874992] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13f7d40 00:25:41.609 [2024-07-12 22:32:51.875004] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:41.609 [2024-07-12 22:32:51.876838] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:41.609 [2024-07-12 22:32:51.876867] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:41.609 BaseBdev1 00:25:41.609 22:32:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:41.609 22:32:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:42.177 BaseBdev2_malloc 00:25:42.177 22:32:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:42.435 [2024-07-12 22:32:52.631030] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:42.436 [2024-07-12 22:32:52.631082] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:42.436 [2024-07-12 22:32:52.631108] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13f8860 00:25:42.436 [2024-07-12 22:32:52.631120] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:42.436 [2024-07-12 22:32:52.632725] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:42.436 [2024-07-12 22:32:52.632755] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:42.436 BaseBdev2 00:25:42.436 22:32:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:42.436 22:32:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:43.033 BaseBdev3_malloc 00:25:43.033 22:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:43.291 [2024-07-12 22:32:53.381556] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:43.291 [2024-07-12 22:32:53.381603] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:43.291 [2024-07-12 22:32:53.381624] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15a58f0 00:25:43.291 [2024-07-12 22:32:53.381637] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:43.292 [2024-07-12 22:32:53.383174] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:43.292 [2024-07-12 22:32:53.383202] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:43.292 BaseBdev3 00:25:43.292 22:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:43.292 22:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:43.550 BaseBdev4_malloc 00:25:43.550 22:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:43.550 [2024-07-12 22:32:53.864388] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:43.550 [2024-07-12 22:32:53.864434] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:43.550 [2024-07-12 22:32:53.864456] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15a4ad0 00:25:43.550 [2024-07-12 22:32:53.864468] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:43.550 [2024-07-12 22:32:53.866036] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:43.550 [2024-07-12 22:32:53.866065] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:43.550 BaseBdev4 00:25:43.808 22:32:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:43.808 spare_malloc 00:25:43.808 22:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:44.066 spare_delay 00:25:44.066 22:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:44.632 [2024-07-12 22:32:54.844818] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:44.632 [2024-07-12 22:32:54.844866] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:44.632 [2024-07-12 22:32:54.844887] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15a95b0 00:25:44.632 [2024-07-12 22:32:54.844900] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:44.632 [2024-07-12 22:32:54.846488] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:44.632 [2024-07-12 22:32:54.846516] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:44.632 spare 00:25:44.632 22:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:44.891 [2024-07-12 22:32:55.089490] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:44.891 [2024-07-12 22:32:55.090803] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:44.891 [2024-07-12 22:32:55.090860] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:44.891 [2024-07-12 22:32:55.090906] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:44.891 [2024-07-12 22:32:55.091116] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15288a0 00:25:44.891 [2024-07-12 22:32:55.091128] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:44.891 [2024-07-12 22:32:55.091334] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15a2e10 00:25:44.891 [2024-07-12 22:32:55.091486] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15288a0 00:25:44.891 [2024-07-12 22:32:55.091496] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15288a0 00:25:44.891 [2024-07-12 22:32:55.091593] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:44.891 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:44.891 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:44.891 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:44.891 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:44.891 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:44.891 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:44.891 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:44.891 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:44.891 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:44.891 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:44.891 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.891 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.149 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:45.149 "name": "raid_bdev1", 00:25:45.149 "uuid": 
"2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:25:45.149 "strip_size_kb": 0, 00:25:45.149 "state": "online", 00:25:45.149 "raid_level": "raid1", 00:25:45.149 "superblock": true, 00:25:45.149 "num_base_bdevs": 4, 00:25:45.149 "num_base_bdevs_discovered": 4, 00:25:45.149 "num_base_bdevs_operational": 4, 00:25:45.149 "base_bdevs_list": [ 00:25:45.149 { 00:25:45.149 "name": "BaseBdev1", 00:25:45.149 "uuid": "cd189969-5073-5b21-9397-6703a0e93a80", 00:25:45.149 "is_configured": true, 00:25:45.149 "data_offset": 2048, 00:25:45.149 "data_size": 63488 00:25:45.149 }, 00:25:45.149 { 00:25:45.149 "name": "BaseBdev2", 00:25:45.149 "uuid": "9ff0209c-c1b4-5b65-aae2-f60688e9d145", 00:25:45.149 "is_configured": true, 00:25:45.149 "data_offset": 2048, 00:25:45.149 "data_size": 63488 00:25:45.149 }, 00:25:45.149 { 00:25:45.149 "name": "BaseBdev3", 00:25:45.149 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:25:45.149 "is_configured": true, 00:25:45.149 "data_offset": 2048, 00:25:45.149 "data_size": 63488 00:25:45.149 }, 00:25:45.149 { 00:25:45.149 "name": "BaseBdev4", 00:25:45.149 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:25:45.149 "is_configured": true, 00:25:45.149 "data_offset": 2048, 00:25:45.149 "data_size": 63488 00:25:45.149 } 00:25:45.149 ] 00:25:45.149 }' 00:25:45.149 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:45.149 22:32:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:45.716 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:45.716 22:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:45.974 [2024-07-12 22:32:56.076371] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:45.974 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:25:45.974 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.974 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:46.233 22:32:56 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:46.233 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:46.491 [2024-07-12 22:32:56.573448] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15a2e10 00:25:46.491 /dev/nbd0 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:46.491 1+0 records in 00:25:46.491 1+0 records out 00:25:46.491 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250993 s, 16.3 MB/s 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:25:46.491 22:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:25:53.047 63488+0 records in 00:25:53.047 63488+0 records out 00:25:53.047 32505856 bytes (33 MB, 31 MiB) copied, 6.09838 s, 5.3 MB/s 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:53.047 22:33:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:25:53.047 [2024-07-12 22:33:03.035834] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:53.047 [2024-07-12 22:33:03.216359] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.047 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq 
-r '.[] | select(.name == "raid_bdev1")' 00:25:53.306 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:53.306 "name": "raid_bdev1", 00:25:53.306 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:25:53.306 "strip_size_kb": 0, 00:25:53.306 "state": "online", 00:25:53.306 "raid_level": "raid1", 00:25:53.306 "superblock": true, 00:25:53.306 "num_base_bdevs": 4, 00:25:53.306 "num_base_bdevs_discovered": 3, 00:25:53.306 "num_base_bdevs_operational": 3, 00:25:53.306 "base_bdevs_list": [ 00:25:53.306 { 00:25:53.306 "name": null, 00:25:53.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.306 "is_configured": false, 00:25:53.306 "data_offset": 2048, 00:25:53.306 "data_size": 63488 00:25:53.306 }, 00:25:53.306 { 00:25:53.306 "name": "BaseBdev2", 00:25:53.306 "uuid": "9ff0209c-c1b4-5b65-aae2-f60688e9d145", 00:25:53.306 "is_configured": true, 00:25:53.306 "data_offset": 2048, 00:25:53.306 "data_size": 63488 00:25:53.306 }, 00:25:53.306 { 00:25:53.306 "name": "BaseBdev3", 00:25:53.306 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:25:53.306 "is_configured": true, 00:25:53.306 "data_offset": 2048, 00:25:53.306 "data_size": 63488 00:25:53.306 }, 00:25:53.306 { 00:25:53.306 "name": "BaseBdev4", 00:25:53.306 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:25:53.306 "is_configured": true, 00:25:53.306 "data_offset": 2048, 00:25:53.306 "data_size": 63488 00:25:53.306 } 00:25:53.306 ] 00:25:53.306 }' 00:25:53.306 22:33:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:53.306 22:33:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:53.874 22:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:53.874 [2024-07-12 22:33:04.166880] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:53.874 [2024-07-12 22:33:04.171004] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15a2e10 00:25:53.874 [2024-07-12 22:33:04.173358] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:53.874 22:33:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:55.252 "name": "raid_bdev1", 00:25:55.252 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:25:55.252 "strip_size_kb": 0, 00:25:55.252 "state": "online", 00:25:55.252 "raid_level": "raid1", 00:25:55.252 
"superblock": true, 00:25:55.252 "num_base_bdevs": 4, 00:25:55.252 "num_base_bdevs_discovered": 4, 00:25:55.252 "num_base_bdevs_operational": 4, 00:25:55.252 "process": { 00:25:55.252 "type": "rebuild", 00:25:55.252 "target": "spare", 00:25:55.252 "progress": { 00:25:55.252 "blocks": 22528, 00:25:55.252 "percent": 35 00:25:55.252 } 00:25:55.252 }, 00:25:55.252 "base_bdevs_list": [ 00:25:55.252 { 00:25:55.252 "name": "spare", 00:25:55.252 "uuid": "20bcc79f-bb63-5bf6-bcd9-207a679689be", 00:25:55.252 "is_configured": true, 00:25:55.252 "data_offset": 2048, 00:25:55.252 "data_size": 63488 00:25:55.252 }, 00:25:55.252 { 00:25:55.252 "name": "BaseBdev2", 00:25:55.252 "uuid": "9ff0209c-c1b4-5b65-aae2-f60688e9d145", 00:25:55.252 "is_configured": true, 00:25:55.252 "data_offset": 2048, 00:25:55.252 "data_size": 63488 00:25:55.252 }, 00:25:55.252 { 00:25:55.252 "name": "BaseBdev3", 00:25:55.252 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:25:55.252 "is_configured": true, 00:25:55.252 "data_offset": 2048, 00:25:55.252 "data_size": 63488 00:25:55.252 }, 00:25:55.252 { 00:25:55.252 "name": "BaseBdev4", 00:25:55.252 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:25:55.252 "is_configured": true, 00:25:55.252 "data_offset": 2048, 00:25:55.252 "data_size": 63488 00:25:55.252 } 00:25:55.252 ] 00:25:55.252 }' 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:55.252 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:55.511 [2024-07-12 22:33:05.668417] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:55.511 [2024-07-12 22:33:05.685478] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:55.512 [2024-07-12 22:33:05.685521] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:55.512 [2024-07-12 22:33:05.685540] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:55.512 [2024-07-12 22:33:05.685549] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:55.512 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:55.512 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:55.512 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:55.512 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:55.512 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:55.512 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:55.512 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:55.512 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:55.512 22:33:05 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:55.512 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:55.512 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.512 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.770 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:55.770 "name": "raid_bdev1", 00:25:55.770 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:25:55.770 "strip_size_kb": 0, 00:25:55.770 "state": "online", 00:25:55.770 "raid_level": "raid1", 00:25:55.770 "superblock": true, 00:25:55.770 "num_base_bdevs": 4, 00:25:55.770 "num_base_bdevs_discovered": 3, 00:25:55.770 "num_base_bdevs_operational": 3, 00:25:55.770 "base_bdevs_list": [ 00:25:55.770 { 00:25:55.770 "name": null, 00:25:55.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.771 "is_configured": false, 00:25:55.771 "data_offset": 2048, 00:25:55.771 "data_size": 63488 00:25:55.771 }, 00:25:55.771 { 00:25:55.771 "name": "BaseBdev2", 00:25:55.771 "uuid": "9ff0209c-c1b4-5b65-aae2-f60688e9d145", 00:25:55.771 "is_configured": true, 00:25:55.771 "data_offset": 2048, 00:25:55.771 "data_size": 63488 00:25:55.771 }, 00:25:55.771 { 00:25:55.771 "name": "BaseBdev3", 00:25:55.771 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:25:55.771 "is_configured": true, 00:25:55.771 "data_offset": 2048, 00:25:55.771 "data_size": 63488 00:25:55.771 }, 00:25:55.771 { 00:25:55.771 "name": "BaseBdev4", 00:25:55.771 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:25:55.771 "is_configured": true, 00:25:55.771 "data_offset": 2048, 00:25:55.771 "data_size": 63488 00:25:55.771 } 00:25:55.771 ] 00:25:55.771 }' 00:25:55.771 22:33:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:55.771 22:33:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:56.339 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:56.339 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:56.339 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:56.339 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:56.339 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:56.339 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.339 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.598 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:56.598 "name": "raid_bdev1", 00:25:56.598 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:25:56.598 "strip_size_kb": 0, 00:25:56.598 "state": "online", 00:25:56.598 "raid_level": "raid1", 00:25:56.598 "superblock": true, 00:25:56.598 "num_base_bdevs": 4, 00:25:56.598 "num_base_bdevs_discovered": 3, 00:25:56.598 "num_base_bdevs_operational": 3, 00:25:56.598 "base_bdevs_list": [ 00:25:56.598 { 00:25:56.598 "name": null, 
00:25:56.598 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.598 "is_configured": false, 00:25:56.598 "data_offset": 2048, 00:25:56.598 "data_size": 63488 00:25:56.598 }, 00:25:56.598 { 00:25:56.598 "name": "BaseBdev2", 00:25:56.598 "uuid": "9ff0209c-c1b4-5b65-aae2-f60688e9d145", 00:25:56.598 "is_configured": true, 00:25:56.598 "data_offset": 2048, 00:25:56.598 "data_size": 63488 00:25:56.598 }, 00:25:56.598 { 00:25:56.598 "name": "BaseBdev3", 00:25:56.598 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:25:56.598 "is_configured": true, 00:25:56.598 "data_offset": 2048, 00:25:56.598 "data_size": 63488 00:25:56.598 }, 00:25:56.598 { 00:25:56.598 "name": "BaseBdev4", 00:25:56.598 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:25:56.598 "is_configured": true, 00:25:56.598 "data_offset": 2048, 00:25:56.598 "data_size": 63488 00:25:56.598 } 00:25:56.598 ] 00:25:56.598 }' 00:25:56.598 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:56.598 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:56.598 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:56.598 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:56.598 22:33:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:56.871 [2024-07-12 22:33:07.033746] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:56.871 [2024-07-12 22:33:07.037867] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10fefa0 00:25:56.871 [2024-07-12 22:33:07.039375] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:56.871 22:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:57.810 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:57.810 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:57.810 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:57.810 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:57.810 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:57.810 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.810 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.067 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:58.067 "name": "raid_bdev1", 00:25:58.067 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:25:58.067 "strip_size_kb": 0, 00:25:58.067 "state": "online", 00:25:58.067 "raid_level": "raid1", 00:25:58.067 "superblock": true, 00:25:58.067 "num_base_bdevs": 4, 00:25:58.067 "num_base_bdevs_discovered": 4, 00:25:58.067 "num_base_bdevs_operational": 4, 00:25:58.067 "process": { 00:25:58.067 "type": "rebuild", 00:25:58.067 "target": "spare", 00:25:58.067 "progress": { 00:25:58.067 "blocks": 24576, 00:25:58.067 "percent": 38 00:25:58.067 
} 00:25:58.067 }, 00:25:58.067 "base_bdevs_list": [ 00:25:58.067 { 00:25:58.067 "name": "spare", 00:25:58.067 "uuid": "20bcc79f-bb63-5bf6-bcd9-207a679689be", 00:25:58.067 "is_configured": true, 00:25:58.067 "data_offset": 2048, 00:25:58.067 "data_size": 63488 00:25:58.067 }, 00:25:58.067 { 00:25:58.067 "name": "BaseBdev2", 00:25:58.067 "uuid": "9ff0209c-c1b4-5b65-aae2-f60688e9d145", 00:25:58.067 "is_configured": true, 00:25:58.067 "data_offset": 2048, 00:25:58.067 "data_size": 63488 00:25:58.067 }, 00:25:58.067 { 00:25:58.067 "name": "BaseBdev3", 00:25:58.067 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:25:58.067 "is_configured": true, 00:25:58.067 "data_offset": 2048, 00:25:58.067 "data_size": 63488 00:25:58.067 }, 00:25:58.067 { 00:25:58.067 "name": "BaseBdev4", 00:25:58.067 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:25:58.067 "is_configured": true, 00:25:58.067 "data_offset": 2048, 00:25:58.067 "data_size": 63488 00:25:58.067 } 00:25:58.067 ] 00:25:58.067 }' 00:25:58.067 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:58.067 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:58.067 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:58.326 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:58.326 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:58.326 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:58.326 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:58.326 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:58.326 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:58.326 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:58.326 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:58.326 [2024-07-12 22:33:08.631017] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:58.585 [2024-07-12 22:33:08.752210] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x10fefa0 00:25:58.585 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:58.585 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:58.585 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:58.585 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:58.585 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:58.585 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:58.585 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:58.585 22:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.585 22:33:08 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:58.844 "name": "raid_bdev1", 00:25:58.844 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:25:58.844 "strip_size_kb": 0, 00:25:58.844 "state": "online", 00:25:58.844 "raid_level": "raid1", 00:25:58.844 "superblock": true, 00:25:58.844 "num_base_bdevs": 4, 00:25:58.844 "num_base_bdevs_discovered": 3, 00:25:58.844 "num_base_bdevs_operational": 3, 00:25:58.844 "process": { 00:25:58.844 "type": "rebuild", 00:25:58.844 "target": "spare", 00:25:58.844 "progress": { 00:25:58.844 "blocks": 36864, 00:25:58.844 "percent": 58 00:25:58.844 } 00:25:58.844 }, 00:25:58.844 "base_bdevs_list": [ 00:25:58.844 { 00:25:58.844 "name": "spare", 00:25:58.844 "uuid": "20bcc79f-bb63-5bf6-bcd9-207a679689be", 00:25:58.844 "is_configured": true, 00:25:58.844 "data_offset": 2048, 00:25:58.844 "data_size": 63488 00:25:58.844 }, 00:25:58.844 { 00:25:58.844 "name": null, 00:25:58.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.844 "is_configured": false, 00:25:58.844 "data_offset": 2048, 00:25:58.844 "data_size": 63488 00:25:58.844 }, 00:25:58.844 { 00:25:58.844 "name": "BaseBdev3", 00:25:58.844 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:25:58.844 "is_configured": true, 00:25:58.844 "data_offset": 2048, 00:25:58.844 "data_size": 63488 00:25:58.844 }, 00:25:58.844 { 00:25:58.844 "name": "BaseBdev4", 00:25:58.844 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:25:58.844 "is_configured": true, 00:25:58.844 "data_offset": 2048, 00:25:58.844 "data_size": 63488 00:25:58.844 } 00:25:58.844 ] 00:25:58.844 }' 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=893 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.844 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:59.102 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:59.102 "name": "raid_bdev1", 00:25:59.102 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:25:59.102 "strip_size_kb": 0, 00:25:59.102 "state": "online", 
00:25:59.102 "raid_level": "raid1", 00:25:59.102 "superblock": true, 00:25:59.102 "num_base_bdevs": 4, 00:25:59.102 "num_base_bdevs_discovered": 3, 00:25:59.102 "num_base_bdevs_operational": 3, 00:25:59.102 "process": { 00:25:59.102 "type": "rebuild", 00:25:59.102 "target": "spare", 00:25:59.102 "progress": { 00:25:59.102 "blocks": 43008, 00:25:59.102 "percent": 67 00:25:59.102 } 00:25:59.102 }, 00:25:59.102 "base_bdevs_list": [ 00:25:59.102 { 00:25:59.102 "name": "spare", 00:25:59.102 "uuid": "20bcc79f-bb63-5bf6-bcd9-207a679689be", 00:25:59.102 "is_configured": true, 00:25:59.102 "data_offset": 2048, 00:25:59.102 "data_size": 63488 00:25:59.102 }, 00:25:59.102 { 00:25:59.102 "name": null, 00:25:59.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.102 "is_configured": false, 00:25:59.102 "data_offset": 2048, 00:25:59.102 "data_size": 63488 00:25:59.102 }, 00:25:59.102 { 00:25:59.102 "name": "BaseBdev3", 00:25:59.102 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:25:59.102 "is_configured": true, 00:25:59.102 "data_offset": 2048, 00:25:59.102 "data_size": 63488 00:25:59.102 }, 00:25:59.102 { 00:25:59.102 "name": "BaseBdev4", 00:25:59.102 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:25:59.102 "is_configured": true, 00:25:59.102 "data_offset": 2048, 00:25:59.102 "data_size": 63488 00:25:59.102 } 00:25:59.102 ] 00:25:59.102 }' 00:25:59.102 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:59.102 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:59.102 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:59.359 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:59.359 22:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:00.291 [2024-07-12 22:33:10.264135] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:00.291 [2024-07-12 22:33:10.264201] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:00.291 [2024-07-12 22:33:10.264296] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:00.291 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:00.291 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:00.291 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:00.291 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:00.291 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:00.291 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:00.291 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.291 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.548 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:00.548 "name": "raid_bdev1", 00:26:00.548 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:00.549 "strip_size_kb": 0, 00:26:00.549 "state": "online", 
00:26:00.549 "raid_level": "raid1", 00:26:00.549 "superblock": true, 00:26:00.549 "num_base_bdevs": 4, 00:26:00.549 "num_base_bdevs_discovered": 3, 00:26:00.549 "num_base_bdevs_operational": 3, 00:26:00.549 "base_bdevs_list": [ 00:26:00.549 { 00:26:00.549 "name": "spare", 00:26:00.549 "uuid": "20bcc79f-bb63-5bf6-bcd9-207a679689be", 00:26:00.549 "is_configured": true, 00:26:00.549 "data_offset": 2048, 00:26:00.549 "data_size": 63488 00:26:00.549 }, 00:26:00.549 { 00:26:00.549 "name": null, 00:26:00.549 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:00.549 "is_configured": false, 00:26:00.549 "data_offset": 2048, 00:26:00.549 "data_size": 63488 00:26:00.549 }, 00:26:00.549 { 00:26:00.549 "name": "BaseBdev3", 00:26:00.549 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:00.549 "is_configured": true, 00:26:00.549 "data_offset": 2048, 00:26:00.549 "data_size": 63488 00:26:00.549 }, 00:26:00.549 { 00:26:00.549 "name": "BaseBdev4", 00:26:00.549 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:00.549 "is_configured": true, 00:26:00.549 "data_offset": 2048, 00:26:00.549 "data_size": 63488 00:26:00.549 } 00:26:00.549 ] 00:26:00.549 }' 00:26:00.549 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:00.549 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:00.549 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:00.549 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:00.549 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:26:00.549 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:00.549 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:00.549 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:00.549 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:00.549 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:00.549 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.549 22:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.806 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:00.807 "name": "raid_bdev1", 00:26:00.807 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:00.807 "strip_size_kb": 0, 00:26:00.807 "state": "online", 00:26:00.807 "raid_level": "raid1", 00:26:00.807 "superblock": true, 00:26:00.807 "num_base_bdevs": 4, 00:26:00.807 "num_base_bdevs_discovered": 3, 00:26:00.807 "num_base_bdevs_operational": 3, 00:26:00.807 "base_bdevs_list": [ 00:26:00.807 { 00:26:00.807 "name": "spare", 00:26:00.807 "uuid": "20bcc79f-bb63-5bf6-bcd9-207a679689be", 00:26:00.807 "is_configured": true, 00:26:00.807 "data_offset": 2048, 00:26:00.807 "data_size": 63488 00:26:00.807 }, 00:26:00.807 { 00:26:00.807 "name": null, 00:26:00.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:00.807 "is_configured": false, 00:26:00.807 "data_offset": 2048, 00:26:00.807 "data_size": 63488 00:26:00.807 }, 00:26:00.807 { 00:26:00.807 "name": 
"BaseBdev3", 00:26:00.807 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:00.807 "is_configured": true, 00:26:00.807 "data_offset": 2048, 00:26:00.807 "data_size": 63488 00:26:00.807 }, 00:26:00.807 { 00:26:00.807 "name": "BaseBdev4", 00:26:00.807 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:00.807 "is_configured": true, 00:26:00.807 "data_offset": 2048, 00:26:00.807 "data_size": 63488 00:26:00.807 } 00:26:00.807 ] 00:26:00.807 }' 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.807 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.064 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:01.064 "name": "raid_bdev1", 00:26:01.064 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:01.064 "strip_size_kb": 0, 00:26:01.064 "state": "online", 00:26:01.064 "raid_level": "raid1", 00:26:01.064 "superblock": true, 00:26:01.064 "num_base_bdevs": 4, 00:26:01.064 "num_base_bdevs_discovered": 3, 00:26:01.064 "num_base_bdevs_operational": 3, 00:26:01.064 "base_bdevs_list": [ 00:26:01.064 { 00:26:01.064 "name": "spare", 00:26:01.064 "uuid": "20bcc79f-bb63-5bf6-bcd9-207a679689be", 00:26:01.064 "is_configured": true, 00:26:01.064 "data_offset": 2048, 00:26:01.065 "data_size": 63488 00:26:01.065 }, 00:26:01.065 { 00:26:01.065 "name": null, 00:26:01.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:01.065 "is_configured": false, 00:26:01.065 "data_offset": 2048, 00:26:01.065 "data_size": 63488 00:26:01.065 }, 00:26:01.065 { 00:26:01.065 "name": "BaseBdev3", 00:26:01.065 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:01.065 "is_configured": true, 00:26:01.065 "data_offset": 2048, 00:26:01.065 "data_size": 63488 00:26:01.065 }, 00:26:01.065 { 00:26:01.065 "name": "BaseBdev4", 00:26:01.065 "uuid": 
"f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:01.065 "is_configured": true, 00:26:01.065 "data_offset": 2048, 00:26:01.065 "data_size": 63488 00:26:01.065 } 00:26:01.065 ] 00:26:01.065 }' 00:26:01.065 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:01.065 22:33:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:01.997 22:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:01.997 [2024-07-12 22:33:12.181588] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:01.997 [2024-07-12 22:33:12.181615] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:01.997 [2024-07-12 22:33:12.181675] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:01.997 [2024-07-12 22:33:12.181749] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:01.997 [2024-07-12 22:33:12.181761] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15288a0 name raid_bdev1, state offline 00:26:01.997 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.997 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:02.254 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:02.512 /dev/nbd0 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:02.512 1+0 records in 00:26:02.512 1+0 records out 00:26:02.512 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226477 s, 18.1 MB/s 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:02.512 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:02.769 /dev/nbd1 00:26:02.769 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:02.769 22:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:02.769 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:02.769 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:26:02.769 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:02.769 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:02.769 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:02.769 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:26:02.769 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:02.769 22:33:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:02.769 1+0 records in 00:26:02.769 1+0 records out 00:26:02.769 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262369 s, 15.6 MB/s 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:02.769 22:33:13 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:02.769 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:03.025 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:03.282 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:03.282 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:03.282 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:03.282 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:03.282 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:03.282 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:26:03.282 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:26:03.282 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:03.282 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:03.540 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:03.540 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:03.540 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:03.540 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:03.540 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:03.540 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:03.540 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:26:03.540 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:26:03.540 
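For readers following the NBD-based check above: the test exports the rebuilt base bdev and the spare as NBD block devices, waits for them to become readable, compares their contents past the 1 MiB superblock/data_offset region, and tears the devices down again. The following is a condensed, illustrative sketch of that flow, not the test script itself; the rpc.py subcommands, socket path, device names, dd/grep probes, the `cmp -i 1048576` offset, and the 20-iteration / 0.1 s waits are taken from the log, while the `$rpc`/`$sock` shorthands and the scratch file path are shortened placeholders.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Export the rebuilt base bdev and the spare over NBD.
    $rpc -s $sock nbd_start_disk BaseBdev1 /dev/nbd0
    $rpc -s $sock nbd_start_disk spare /dev/nbd1

    # Wait for each device to appear in /proc/partitions, then verify a direct 4 KiB read works.
    for nbd in nbd0 nbd1; do
        for i in $(seq 1 20); do
            grep -q -w "$nbd" /proc/partitions && break
            sleep 0.1
        done
        dd if=/dev/$nbd of=/tmp/nbdtest bs=4096 count=1 iflag=direct
    done

    # Compare the data regions, skipping the first 1 MiB (the superblock area) on both devices.
    cmp -i 1048576 /dev/nbd0 /dev/nbd1

    # Detach the NBD devices and wait for them to disappear from /proc/partitions.
    for nbd in nbd0 nbd1; do
        $rpc -s $sock nbd_stop_disk /dev/$nbd
        while grep -q -w "$nbd" /proc/partitions; do sleep 0.1; done
    done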
22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:03.540 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:03.799 22:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:03.799 [2024-07-12 22:33:14.103144] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:03.799 [2024-07-12 22:33:14.103191] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:03.799 [2024-07-12 22:33:14.103214] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x152d930 00:26:03.799 [2024-07-12 22:33:14.103227] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:03.799 [2024-07-12 22:33:14.104853] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:03.799 [2024-07-12 22:33:14.104880] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:03.799 [2024-07-12 22:33:14.104965] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:03.799 [2024-07-12 22:33:14.104993] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:03.799 [2024-07-12 22:33:14.105099] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:03.799 [2024-07-12 22:33:14.105172] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:03.799 spare 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.058 [2024-07-12 22:33:14.205491] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1529b10 00:26:04.058 [2024-07-12 22:33:14.205506] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:04.058 [2024-07-12 22:33:14.205717] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x152d820 
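At this point raid_bdev1 has already been deleted, so recreating the "spare" passthru on top of spare_delay is what triggers reassembly: bdev examine finds the RAID superblock on the new passthru, claims spare, BaseBdev3, and BaseBdev4, and brings raid_bdev1 back online with 3 of 4 base bdevs. A minimal sketch of the RPC sequence, assuming the same rpc.py script and socket as in the log (the jq check is illustrative and mirrors the verify_raid_bdev_state helper rather than reproducing it):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Drop the passthru vbdev that backs the spare, then recreate it on top of spare_delay.
    $rpc -s $sock bdev_passthru_delete spare
    $rpc -s $sock bdev_passthru_create -b spare_delay -p spare

    # Examine finds the raid superblock on "spare"; raid_bdev1 is reassembled from the
    # on-disk superblocks and should report state "online" with 3 base bdevs discovered.
    $rpc -s $sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | .state, .num_base_bdevs_discovered'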
00:26:04.058 [2024-07-12 22:33:14.205869] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1529b10 00:26:04.058 [2024-07-12 22:33:14.205879] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1529b10 00:26:04.058 [2024-07-12 22:33:14.205991] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:04.058 "name": "raid_bdev1", 00:26:04.058 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:04.058 "strip_size_kb": 0, 00:26:04.058 "state": "online", 00:26:04.058 "raid_level": "raid1", 00:26:04.058 "superblock": true, 00:26:04.058 "num_base_bdevs": 4, 00:26:04.058 "num_base_bdevs_discovered": 3, 00:26:04.058 "num_base_bdevs_operational": 3, 00:26:04.058 "base_bdevs_list": [ 00:26:04.058 { 00:26:04.058 "name": "spare", 00:26:04.058 "uuid": "20bcc79f-bb63-5bf6-bcd9-207a679689be", 00:26:04.058 "is_configured": true, 00:26:04.058 "data_offset": 2048, 00:26:04.058 "data_size": 63488 00:26:04.058 }, 00:26:04.058 { 00:26:04.058 "name": null, 00:26:04.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.058 "is_configured": false, 00:26:04.058 "data_offset": 2048, 00:26:04.058 "data_size": 63488 00:26:04.058 }, 00:26:04.058 { 00:26:04.058 "name": "BaseBdev3", 00:26:04.058 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:04.058 "is_configured": true, 00:26:04.058 "data_offset": 2048, 00:26:04.058 "data_size": 63488 00:26:04.058 }, 00:26:04.058 { 00:26:04.058 "name": "BaseBdev4", 00:26:04.058 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:04.058 "is_configured": true, 00:26:04.058 "data_offset": 2048, 00:26:04.058 "data_size": 63488 00:26:04.058 } 00:26:04.058 ] 00:26:04.058 }' 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:04.058 22:33:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:04.675 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:04.675 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:04.675 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:04.675 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:04.675 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:04.675 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.675 22:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.935 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:04.935 "name": "raid_bdev1", 00:26:04.935 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:04.935 "strip_size_kb": 0, 00:26:04.935 "state": "online", 00:26:04.935 "raid_level": "raid1", 00:26:04.935 "superblock": true, 00:26:04.935 "num_base_bdevs": 4, 00:26:04.935 "num_base_bdevs_discovered": 3, 00:26:04.935 "num_base_bdevs_operational": 3, 00:26:04.935 "base_bdevs_list": [ 00:26:04.935 { 00:26:04.935 "name": "spare", 00:26:04.935 "uuid": "20bcc79f-bb63-5bf6-bcd9-207a679689be", 00:26:04.935 "is_configured": true, 00:26:04.935 "data_offset": 2048, 
00:26:04.936 "data_size": 63488 00:26:04.936 }, 00:26:04.936 { 00:26:04.936 "name": null, 00:26:04.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.936 "is_configured": false, 00:26:04.936 "data_offset": 2048, 00:26:04.936 "data_size": 63488 00:26:04.936 }, 00:26:04.936 { 00:26:04.936 "name": "BaseBdev3", 00:26:04.936 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:04.936 "is_configured": true, 00:26:04.936 "data_offset": 2048, 00:26:04.936 "data_size": 63488 00:26:04.936 }, 00:26:04.936 { 00:26:04.936 "name": "BaseBdev4", 00:26:04.936 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:04.936 "is_configured": true, 00:26:04.936 "data_offset": 2048, 00:26:04.936 "data_size": 63488 00:26:04.936 } 00:26:04.936 ] 00:26:04.936 }' 00:26:04.936 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:05.194 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:05.194 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:05.194 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:05.194 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.194 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:05.453 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:05.453 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:05.453 [2024-07-12 22:33:15.775690] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:05.712 "name": "raid_bdev1", 00:26:05.712 "uuid": 
"2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:05.712 "strip_size_kb": 0, 00:26:05.712 "state": "online", 00:26:05.712 "raid_level": "raid1", 00:26:05.712 "superblock": true, 00:26:05.712 "num_base_bdevs": 4, 00:26:05.712 "num_base_bdevs_discovered": 2, 00:26:05.712 "num_base_bdevs_operational": 2, 00:26:05.712 "base_bdevs_list": [ 00:26:05.712 { 00:26:05.712 "name": null, 00:26:05.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:05.712 "is_configured": false, 00:26:05.712 "data_offset": 2048, 00:26:05.712 "data_size": 63488 00:26:05.712 }, 00:26:05.712 { 00:26:05.712 "name": null, 00:26:05.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:05.712 "is_configured": false, 00:26:05.712 "data_offset": 2048, 00:26:05.712 "data_size": 63488 00:26:05.712 }, 00:26:05.712 { 00:26:05.712 "name": "BaseBdev3", 00:26:05.712 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:05.712 "is_configured": true, 00:26:05.712 "data_offset": 2048, 00:26:05.712 "data_size": 63488 00:26:05.712 }, 00:26:05.712 { 00:26:05.712 "name": "BaseBdev4", 00:26:05.712 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:05.712 "is_configured": true, 00:26:05.712 "data_offset": 2048, 00:26:05.712 "data_size": 63488 00:26:05.712 } 00:26:05.712 ] 00:26:05.712 }' 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:05.712 22:33:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:06.280 22:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:06.539 [2024-07-12 22:33:16.714225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:06.539 [2024-07-12 22:33:16.714373] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:06.539 [2024-07-12 22:33:16.714389] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:06.539 [2024-07-12 22:33:16.714416] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:06.539 [2024-07-12 22:33:16.718375] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13eeeb0 00:26:06.539 [2024-07-12 22:33:16.720741] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:06.539 22:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:07.474 22:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:07.474 22:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.474 22:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:07.474 22:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:07.474 22:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.474 22:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.474 22:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.734 22:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:07.734 "name": "raid_bdev1", 00:26:07.734 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:07.734 "strip_size_kb": 0, 00:26:07.734 "state": "online", 00:26:07.734 "raid_level": "raid1", 00:26:07.734 "superblock": true, 00:26:07.734 "num_base_bdevs": 4, 00:26:07.734 "num_base_bdevs_discovered": 3, 00:26:07.734 "num_base_bdevs_operational": 3, 00:26:07.734 "process": { 00:26:07.734 "type": "rebuild", 00:26:07.734 "target": "spare", 00:26:07.734 "progress": { 00:26:07.734 "blocks": 24576, 00:26:07.734 "percent": 38 00:26:07.734 } 00:26:07.734 }, 00:26:07.734 "base_bdevs_list": [ 00:26:07.734 { 00:26:07.734 "name": "spare", 00:26:07.734 "uuid": "20bcc79f-bb63-5bf6-bcd9-207a679689be", 00:26:07.734 "is_configured": true, 00:26:07.734 "data_offset": 2048, 00:26:07.734 "data_size": 63488 00:26:07.734 }, 00:26:07.734 { 00:26:07.734 "name": null, 00:26:07.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.734 "is_configured": false, 00:26:07.734 "data_offset": 2048, 00:26:07.734 "data_size": 63488 00:26:07.734 }, 00:26:07.734 { 00:26:07.734 "name": "BaseBdev3", 00:26:07.734 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:07.734 "is_configured": true, 00:26:07.734 "data_offset": 2048, 00:26:07.734 "data_size": 63488 00:26:07.734 }, 00:26:07.734 { 00:26:07.734 "name": "BaseBdev4", 00:26:07.734 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:07.734 "is_configured": true, 00:26:07.734 "data_offset": 2048, 00:26:07.734 "data_size": 63488 00:26:07.734 } 00:26:07.734 ] 00:26:07.734 }' 00:26:07.734 22:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:07.734 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:07.734 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:07.993 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:07.993 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:07.993 [2024-07-12 22:33:18.241129] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:08.253 [2024-07-12 22:33:18.333344] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:08.253 [2024-07-12 22:33:18.333390] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:08.253 [2024-07-12 22:33:18.333407] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:08.253 [2024-07-12 22:33:18.333415] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:08.253 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:08.253 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:08.253 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:08.253 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:08.253 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:08.253 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:08.253 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:08.253 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:08.253 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:08.253 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:08.253 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.253 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.512 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:08.512 "name": "raid_bdev1", 00:26:08.512 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:08.512 "strip_size_kb": 0, 00:26:08.512 "state": "online", 00:26:08.512 "raid_level": "raid1", 00:26:08.512 "superblock": true, 00:26:08.512 "num_base_bdevs": 4, 00:26:08.512 "num_base_bdevs_discovered": 2, 00:26:08.512 "num_base_bdevs_operational": 2, 00:26:08.512 "base_bdevs_list": [ 00:26:08.512 { 00:26:08.512 "name": null, 00:26:08.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.512 "is_configured": false, 00:26:08.512 "data_offset": 2048, 00:26:08.512 "data_size": 63488 00:26:08.512 }, 00:26:08.512 { 00:26:08.512 "name": null, 00:26:08.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.512 "is_configured": false, 00:26:08.512 "data_offset": 2048, 00:26:08.512 "data_size": 63488 00:26:08.512 }, 00:26:08.512 { 00:26:08.512 "name": "BaseBdev3", 00:26:08.512 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:08.512 "is_configured": true, 00:26:08.512 "data_offset": 2048, 00:26:08.512 "data_size": 63488 00:26:08.512 }, 00:26:08.512 { 00:26:08.512 "name": "BaseBdev4", 00:26:08.513 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:08.513 "is_configured": true, 00:26:08.513 "data_offset": 2048, 00:26:08.513 "data_size": 63488 
00:26:08.513 } 00:26:08.513 ] 00:26:08.513 }' 00:26:08.513 22:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:08.513 22:33:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:09.082 22:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:09.082 [2024-07-12 22:33:19.368342] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:09.082 [2024-07-12 22:33:19.368389] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:09.082 [2024-07-12 22:33:19.368412] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x159d0a0 00:26:09.082 [2024-07-12 22:33:19.368424] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:09.082 [2024-07-12 22:33:19.368793] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:09.082 [2024-07-12 22:33:19.368812] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:09.082 [2024-07-12 22:33:19.368892] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:09.082 [2024-07-12 22:33:19.368905] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:09.082 [2024-07-12 22:33:19.368916] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:09.082 [2024-07-12 22:33:19.368944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:09.082 [2024-07-12 22:33:19.372910] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13eeeb0 00:26:09.082 spare 00:26:09.082 [2024-07-12 22:33:19.374313] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:09.082 22:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:10.460 "name": "raid_bdev1", 00:26:10.460 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:10.460 "strip_size_kb": 0, 00:26:10.460 "state": "online", 00:26:10.460 "raid_level": "raid1", 00:26:10.460 "superblock": true, 00:26:10.460 "num_base_bdevs": 4, 00:26:10.460 "num_base_bdevs_discovered": 3, 00:26:10.460 "num_base_bdevs_operational": 3, 00:26:10.460 "process": { 00:26:10.460 "type": "rebuild", 00:26:10.460 "target": 
"spare", 00:26:10.460 "progress": { 00:26:10.460 "blocks": 24576, 00:26:10.460 "percent": 38 00:26:10.460 } 00:26:10.460 }, 00:26:10.460 "base_bdevs_list": [ 00:26:10.460 { 00:26:10.460 "name": "spare", 00:26:10.460 "uuid": "20bcc79f-bb63-5bf6-bcd9-207a679689be", 00:26:10.460 "is_configured": true, 00:26:10.460 "data_offset": 2048, 00:26:10.460 "data_size": 63488 00:26:10.460 }, 00:26:10.460 { 00:26:10.460 "name": null, 00:26:10.460 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:10.460 "is_configured": false, 00:26:10.460 "data_offset": 2048, 00:26:10.460 "data_size": 63488 00:26:10.460 }, 00:26:10.460 { 00:26:10.460 "name": "BaseBdev3", 00:26:10.460 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:10.460 "is_configured": true, 00:26:10.460 "data_offset": 2048, 00:26:10.460 "data_size": 63488 00:26:10.460 }, 00:26:10.460 { 00:26:10.460 "name": "BaseBdev4", 00:26:10.460 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:10.460 "is_configured": true, 00:26:10.460 "data_offset": 2048, 00:26:10.460 "data_size": 63488 00:26:10.460 } 00:26:10.460 ] 00:26:10.460 }' 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:10.460 22:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:10.719 [2024-07-12 22:33:20.962396] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:10.719 [2024-07-12 22:33:20.986839] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:10.719 [2024-07-12 22:33:20.986880] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:10.719 [2024-07-12 22:33:20.986897] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:10.719 [2024-07-12 22:33:20.986905] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:10.719 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:10.719 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:10.719 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:10.719 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:10.719 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:10.719 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:10.719 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:10.719 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:10.720 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:10.720 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:10.720 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.720 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:10.979 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:10.979 "name": "raid_bdev1", 00:26:10.979 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:10.979 "strip_size_kb": 0, 00:26:10.979 "state": "online", 00:26:10.979 "raid_level": "raid1", 00:26:10.979 "superblock": true, 00:26:10.979 "num_base_bdevs": 4, 00:26:10.979 "num_base_bdevs_discovered": 2, 00:26:10.979 "num_base_bdevs_operational": 2, 00:26:10.979 "base_bdevs_list": [ 00:26:10.979 { 00:26:10.979 "name": null, 00:26:10.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:10.979 "is_configured": false, 00:26:10.979 "data_offset": 2048, 00:26:10.979 "data_size": 63488 00:26:10.979 }, 00:26:10.979 { 00:26:10.979 "name": null, 00:26:10.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:10.979 "is_configured": false, 00:26:10.979 "data_offset": 2048, 00:26:10.979 "data_size": 63488 00:26:10.979 }, 00:26:10.979 { 00:26:10.979 "name": "BaseBdev3", 00:26:10.979 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:10.979 "is_configured": true, 00:26:10.979 "data_offset": 2048, 00:26:10.979 "data_size": 63488 00:26:10.979 }, 00:26:10.979 { 00:26:10.979 "name": "BaseBdev4", 00:26:10.979 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:10.979 "is_configured": true, 00:26:10.979 "data_offset": 2048, 00:26:10.979 "data_size": 63488 00:26:10.979 } 00:26:10.979 ] 00:26:10.979 }' 00:26:10.979 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:10.979 22:33:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:11.548 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:11.548 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:11.548 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:11.548 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:11.548 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:11.807 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.807 22:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.807 22:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:11.807 "name": "raid_bdev1", 00:26:11.807 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:11.807 "strip_size_kb": 0, 00:26:11.807 "state": "online", 00:26:11.807 "raid_level": "raid1", 00:26:11.807 "superblock": true, 00:26:11.807 "num_base_bdevs": 4, 00:26:11.807 "num_base_bdevs_discovered": 2, 00:26:11.807 "num_base_bdevs_operational": 2, 00:26:11.807 "base_bdevs_list": [ 00:26:11.807 { 00:26:11.807 "name": null, 00:26:11.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.807 "is_configured": false, 00:26:11.807 "data_offset": 2048, 00:26:11.807 "data_size": 63488 00:26:11.807 }, 00:26:11.807 { 00:26:11.807 "name": null, 00:26:11.807 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:26:11.807 "is_configured": false, 00:26:11.807 "data_offset": 2048, 00:26:11.807 "data_size": 63488 00:26:11.807 }, 00:26:11.807 { 00:26:11.807 "name": "BaseBdev3", 00:26:11.807 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:11.807 "is_configured": true, 00:26:11.807 "data_offset": 2048, 00:26:11.807 "data_size": 63488 00:26:11.807 }, 00:26:11.807 { 00:26:11.807 "name": "BaseBdev4", 00:26:11.807 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:11.807 "is_configured": true, 00:26:11.807 "data_offset": 2048, 00:26:11.807 "data_size": 63488 00:26:11.807 } 00:26:11.807 ] 00:26:11.807 }' 00:26:11.807 22:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:12.066 22:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:12.066 22:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:12.066 22:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:12.066 22:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:12.326 22:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:12.585 [2024-07-12 22:33:22.667339] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:12.585 [2024-07-12 22:33:22.667382] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:12.585 [2024-07-12 22:33:22.667402] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13ee690 00:26:12.585 [2024-07-12 22:33:22.667415] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:12.585 [2024-07-12 22:33:22.667757] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:12.585 [2024-07-12 22:33:22.667774] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:12.585 [2024-07-12 22:33:22.667835] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:12.585 [2024-07-12 22:33:22.667847] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:12.585 [2024-07-12 22:33:22.667858] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:12.585 BaseBdev1 00:26:12.585 22:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:13.521 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:13.521 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:13.521 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:13.521 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:13.521 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:13.521 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:13.521 22:33:23 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:13.521 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:13.521 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:13.521 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:13.521 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.521 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:13.781 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:13.781 "name": "raid_bdev1", 00:26:13.781 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:13.781 "strip_size_kb": 0, 00:26:13.781 "state": "online", 00:26:13.781 "raid_level": "raid1", 00:26:13.781 "superblock": true, 00:26:13.781 "num_base_bdevs": 4, 00:26:13.781 "num_base_bdevs_discovered": 2, 00:26:13.781 "num_base_bdevs_operational": 2, 00:26:13.781 "base_bdevs_list": [ 00:26:13.781 { 00:26:13.781 "name": null, 00:26:13.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.781 "is_configured": false, 00:26:13.781 "data_offset": 2048, 00:26:13.781 "data_size": 63488 00:26:13.781 }, 00:26:13.781 { 00:26:13.781 "name": null, 00:26:13.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.781 "is_configured": false, 00:26:13.781 "data_offset": 2048, 00:26:13.781 "data_size": 63488 00:26:13.781 }, 00:26:13.781 { 00:26:13.781 "name": "BaseBdev3", 00:26:13.781 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:13.781 "is_configured": true, 00:26:13.781 "data_offset": 2048, 00:26:13.781 "data_size": 63488 00:26:13.781 }, 00:26:13.781 { 00:26:13.781 "name": "BaseBdev4", 00:26:13.781 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:13.781 "is_configured": true, 00:26:13.781 "data_offset": 2048, 00:26:13.781 "data_size": 63488 00:26:13.781 } 00:26:13.781 ] 00:26:13.781 }' 00:26:13.781 22:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:13.781 22:33:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:14.348 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:14.348 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:14.348 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:14.348 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:14.348 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:14.348 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.348 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:14.608 "name": "raid_bdev1", 00:26:14.608 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:14.608 "strip_size_kb": 0, 00:26:14.608 "state": "online", 00:26:14.608 "raid_level": "raid1", 00:26:14.608 "superblock": true, 
00:26:14.608 "num_base_bdevs": 4, 00:26:14.608 "num_base_bdevs_discovered": 2, 00:26:14.608 "num_base_bdevs_operational": 2, 00:26:14.608 "base_bdevs_list": [ 00:26:14.608 { 00:26:14.608 "name": null, 00:26:14.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:14.608 "is_configured": false, 00:26:14.608 "data_offset": 2048, 00:26:14.608 "data_size": 63488 00:26:14.608 }, 00:26:14.608 { 00:26:14.608 "name": null, 00:26:14.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:14.608 "is_configured": false, 00:26:14.608 "data_offset": 2048, 00:26:14.608 "data_size": 63488 00:26:14.608 }, 00:26:14.608 { 00:26:14.608 "name": "BaseBdev3", 00:26:14.608 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:14.608 "is_configured": true, 00:26:14.608 "data_offset": 2048, 00:26:14.608 "data_size": 63488 00:26:14.608 }, 00:26:14.608 { 00:26:14.608 "name": "BaseBdev4", 00:26:14.608 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:14.608 "is_configured": true, 00:26:14.608 "data_offset": 2048, 00:26:14.608 "data_size": 63488 00:26:14.608 } 00:26:14.608 ] 00:26:14.608 }' 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:14.608 22:33:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:14.867 [2024-07-12 22:33:25.089846] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:14.868 [2024-07-12 22:33:25.089983] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:14.868 [2024-07-12 22:33:25.089998] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:14.868 request: 00:26:14.868 { 00:26:14.868 "base_bdev": "BaseBdev1", 00:26:14.868 "raid_bdev": "raid_bdev1", 00:26:14.868 "method": "bdev_raid_add_base_bdev", 00:26:14.868 "req_id": 1 00:26:14.868 } 00:26:14.868 Got JSON-RPC error response 00:26:14.868 response: 00:26:14.868 { 00:26:14.868 "code": -22, 00:26:14.868 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:14.868 } 00:26:14.868 22:33:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:26:14.868 22:33:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:14.868 22:33:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:14.868 22:33:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:14.868 22:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:15.803 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:15.803 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:15.803 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:15.803 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:15.803 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:15.803 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:15.803 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:15.803 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:15.803 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:15.803 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:15.803 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.803 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:16.062 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:16.062 "name": "raid_bdev1", 00:26:16.062 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:16.062 "strip_size_kb": 0, 00:26:16.062 "state": "online", 00:26:16.062 "raid_level": "raid1", 00:26:16.062 "superblock": true, 00:26:16.062 "num_base_bdevs": 4, 00:26:16.062 "num_base_bdevs_discovered": 2, 00:26:16.062 "num_base_bdevs_operational": 2, 00:26:16.062 "base_bdevs_list": [ 00:26:16.062 { 00:26:16.062 "name": null, 00:26:16.062 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:16.062 "is_configured": false, 00:26:16.062 "data_offset": 2048, 00:26:16.062 "data_size": 63488 00:26:16.062 }, 00:26:16.062 { 00:26:16.062 "name": null, 00:26:16.062 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:26:16.062 "is_configured": false, 00:26:16.062 "data_offset": 2048, 00:26:16.062 "data_size": 63488 00:26:16.062 }, 00:26:16.062 { 00:26:16.062 "name": "BaseBdev3", 00:26:16.062 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:16.062 "is_configured": true, 00:26:16.062 "data_offset": 2048, 00:26:16.062 "data_size": 63488 00:26:16.062 }, 00:26:16.062 { 00:26:16.062 "name": "BaseBdev4", 00:26:16.062 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:16.062 "is_configured": true, 00:26:16.062 "data_offset": 2048, 00:26:16.062 "data_size": 63488 00:26:16.062 } 00:26:16.062 ] 00:26:16.062 }' 00:26:16.062 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:16.062 22:33:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:16.998 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:16.998 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:16.998 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:16.998 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:16.998 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:16.998 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.998 22:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:16.998 "name": "raid_bdev1", 00:26:16.998 "uuid": "2c26c1cf-b1b8-43dc-ba55-38026897da91", 00:26:16.998 "strip_size_kb": 0, 00:26:16.998 "state": "online", 00:26:16.998 "raid_level": "raid1", 00:26:16.998 "superblock": true, 00:26:16.998 "num_base_bdevs": 4, 00:26:16.998 "num_base_bdevs_discovered": 2, 00:26:16.998 "num_base_bdevs_operational": 2, 00:26:16.998 "base_bdevs_list": [ 00:26:16.998 { 00:26:16.998 "name": null, 00:26:16.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:16.998 "is_configured": false, 00:26:16.998 "data_offset": 2048, 00:26:16.998 "data_size": 63488 00:26:16.998 }, 00:26:16.998 { 00:26:16.998 "name": null, 00:26:16.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:16.998 "is_configured": false, 00:26:16.998 "data_offset": 2048, 00:26:16.998 "data_size": 63488 00:26:16.998 }, 00:26:16.998 { 00:26:16.998 "name": "BaseBdev3", 00:26:16.998 "uuid": "384cd503-936a-5ad0-be92-56531ad92480", 00:26:16.998 "is_configured": true, 00:26:16.998 "data_offset": 2048, 00:26:16.998 "data_size": 63488 00:26:16.998 }, 00:26:16.998 { 00:26:16.998 "name": "BaseBdev4", 00:26:16.998 "uuid": "f5dbc3ad-a049-5101-bd48-37d770cfe6c6", 00:26:16.998 "is_configured": true, 00:26:16.998 "data_offset": 2048, 00:26:16.998 "data_size": 63488 00:26:16.998 } 00:26:16.998 ] 00:26:16.998 }' 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 3546395 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 3546395 ']' 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 3546395 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3546395 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3546395' 00:26:16.998 killing process with pid 3546395 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 3546395 00:26:16.998 Received shutdown signal, test time was about 60.000000 seconds 00:26:16.998 00:26:16.998 Latency(us) 00:26:16.998 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:16.998 =================================================================================================================== 00:26:16.998 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:16.998 [2024-07-12 22:33:27.304798] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:16.998 [2024-07-12 22:33:27.304894] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:16.998 [2024-07-12 22:33:27.304962] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:16.998 22:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 3546395 00:26:16.999 [2024-07-12 22:33:27.304977] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1529b10 name raid_bdev1, state offline 00:26:17.258 [2024-07-12 22:33:27.353811] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:17.258 22:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:26:17.258 00:26:17.258 real 0m37.141s 00:26:17.258 user 0m54.836s 00:26:17.258 sys 0m6.456s 00:26:17.258 22:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:17.258 22:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:17.258 ************************************ 00:26:17.258 END TEST raid_rebuild_test_sb 00:26:17.258 ************************************ 00:26:17.517 22:33:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:17.517 22:33:27 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:26:17.517 22:33:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:17.517 22:33:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:17.517 22:33:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:17.517 ************************************ 00:26:17.517 START TEST raid_rebuild_test_io 00:26:17.517 ************************************ 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:26:17.517 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:17.518 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=3552110 00:26:17.518 22:33:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 3552110 
/var/tmp/spdk-raid.sock 00:26:17.518 22:33:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 3552110 ']' 00:26:17.518 22:33:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:17.518 22:33:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:17.518 22:33:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:17.518 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:17.518 22:33:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:17.518 22:33:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:17.518 [2024-07-12 22:33:27.718667] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:26:17.518 [2024-07-12 22:33:27.718729] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3552110 ] 00:26:17.518 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:17.518 Zero copy mechanism will not be used. 00:26:17.777 [2024-07-12 22:33:27.846455] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:17.777 [2024-07-12 22:33:27.949350] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:17.777 [2024-07-12 22:33:28.013667] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:17.777 [2024-07-12 22:33:28.013708] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:18.345 22:33:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:18.345 22:33:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:26:18.345 22:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:18.345 22:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:18.603 BaseBdev1_malloc 00:26:18.603 22:33:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:18.862 [2024-07-12 22:33:29.134002] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:18.862 [2024-07-12 22:33:29.134049] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:18.862 [2024-07-12 22:33:29.134075] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x246bd40 00:26:18.862 [2024-07-12 22:33:29.134087] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:18.862 [2024-07-12 22:33:29.135835] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:18.862 [2024-07-12 22:33:29.135863] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:18.862 BaseBdev1 00:26:18.862 22:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:18.862 22:33:29 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:19.122 BaseBdev2_malloc 00:26:19.122 22:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:19.381 [2024-07-12 22:33:29.629365] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:19.381 [2024-07-12 22:33:29.629410] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:19.381 [2024-07-12 22:33:29.629434] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x246c860 00:26:19.381 [2024-07-12 22:33:29.629446] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:19.381 [2024-07-12 22:33:29.631011] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:19.381 [2024-07-12 22:33:29.631039] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:19.381 BaseBdev2 00:26:19.381 22:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:19.381 22:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:19.641 BaseBdev3_malloc 00:26:19.641 22:33:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:19.900 [2024-07-12 22:33:30.123267] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:19.900 [2024-07-12 22:33:30.123314] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:19.900 [2024-07-12 22:33:30.123335] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26198f0 00:26:19.900 [2024-07-12 22:33:30.123347] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:19.900 [2024-07-12 22:33:30.124917] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:19.900 [2024-07-12 22:33:30.124950] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:19.900 BaseBdev3 00:26:19.900 22:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:19.900 22:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:20.159 BaseBdev4_malloc 00:26:20.159 22:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:20.417 [2024-07-12 22:33:30.606371] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:20.417 [2024-07-12 22:33:30.606416] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:20.417 [2024-07-12 22:33:30.606437] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2618ad0 00:26:20.417 [2024-07-12 
22:33:30.606457] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:20.417 [2024-07-12 22:33:30.608023] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:20.417 [2024-07-12 22:33:30.608051] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:26:20.417 BaseBdev4 00:26:20.418 22:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:20.676 spare_malloc 00:26:20.676 22:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:20.935 spare_delay 00:26:20.935 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:21.195 [2024-07-12 22:33:31.350154] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:21.195 [2024-07-12 22:33:31.350198] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:21.195 [2024-07-12 22:33:31.350218] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x261d5b0 00:26:21.195 [2024-07-12 22:33:31.350230] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:21.195 [2024-07-12 22:33:31.351802] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:21.195 [2024-07-12 22:33:31.351830] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:21.195 spare 00:26:21.195 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:21.454 [2024-07-12 22:33:31.582789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:21.454 [2024-07-12 22:33:31.584081] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:21.454 [2024-07-12 22:33:31.584136] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:21.454 [2024-07-12 22:33:31.584182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:21.454 [2024-07-12 22:33:31.584264] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x259c8a0 00:26:21.454 [2024-07-12 22:33:31.584275] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:26:21.454 [2024-07-12 22:33:31.584490] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2616e10 00:26:21.454 [2024-07-12 22:33:31.584640] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x259c8a0 00:26:21.454 [2024-07-12 22:33:31.584650] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x259c8a0 00:26:21.454 [2024-07-12 22:33:31.584762] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:21.454 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:21.454 22:33:31 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:21.454 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:21.454 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:21.454 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:21.454 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:21.454 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:21.454 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:21.454 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:21.454 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:21.454 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.454 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:21.714 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:21.714 "name": "raid_bdev1", 00:26:21.714 "uuid": "45c19657-3bce-4c00-9ffc-8598afa523db", 00:26:21.714 "strip_size_kb": 0, 00:26:21.714 "state": "online", 00:26:21.714 "raid_level": "raid1", 00:26:21.714 "superblock": false, 00:26:21.714 "num_base_bdevs": 4, 00:26:21.714 "num_base_bdevs_discovered": 4, 00:26:21.714 "num_base_bdevs_operational": 4, 00:26:21.714 "base_bdevs_list": [ 00:26:21.714 { 00:26:21.714 "name": "BaseBdev1", 00:26:21.714 "uuid": "093d3b40-3fcf-5c77-9c24-f632bbc150f1", 00:26:21.714 "is_configured": true, 00:26:21.714 "data_offset": 0, 00:26:21.714 "data_size": 65536 00:26:21.714 }, 00:26:21.714 { 00:26:21.714 "name": "BaseBdev2", 00:26:21.714 "uuid": "d523bd58-d85b-5a55-856d-ff963d54e5fa", 00:26:21.714 "is_configured": true, 00:26:21.714 "data_offset": 0, 00:26:21.714 "data_size": 65536 00:26:21.714 }, 00:26:21.714 { 00:26:21.714 "name": "BaseBdev3", 00:26:21.714 "uuid": "54c042ac-4a4e-5c9d-bb88-a387b719fa67", 00:26:21.714 "is_configured": true, 00:26:21.714 "data_offset": 0, 00:26:21.714 "data_size": 65536 00:26:21.714 }, 00:26:21.714 { 00:26:21.714 "name": "BaseBdev4", 00:26:21.714 "uuid": "d2d246ac-addf-5d67-bc77-587d2e0546fa", 00:26:21.714 "is_configured": true, 00:26:21.714 "data_offset": 0, 00:26:21.714 "data_size": 65536 00:26:21.714 } 00:26:21.714 ] 00:26:21.714 }' 00:26:21.714 22:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:21.714 22:33:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:22.282 22:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:22.282 22:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:22.540 [2024-07-12 22:33:32.674095] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:22.540 22:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:26:22.540 22:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.540 22:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:22.799 22:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:26:22.799 22:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:26:22.799 22:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:22.799 22:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:22.799 [2024-07-12 22:33:33.036885] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25a2970 00:26:22.799 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:22.799 Zero copy mechanism will not be used. 00:26:22.799 Running I/O for 60 seconds... 00:26:23.057 [2024-07-12 22:33:33.156105] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:23.057 [2024-07-12 22:33:33.172280] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x25a2970 00:26:23.057 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:23.057 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:23.057 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:23.057 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:23.057 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:23.057 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:23.057 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:23.057 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:23.057 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:23.057 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:23.057 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.057 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:23.315 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:23.315 "name": "raid_bdev1", 00:26:23.315 "uuid": "45c19657-3bce-4c00-9ffc-8598afa523db", 00:26:23.315 "strip_size_kb": 0, 00:26:23.315 "state": "online", 00:26:23.315 "raid_level": "raid1", 00:26:23.315 "superblock": false, 00:26:23.315 "num_base_bdevs": 4, 00:26:23.315 "num_base_bdevs_discovered": 3, 00:26:23.315 "num_base_bdevs_operational": 3, 00:26:23.315 "base_bdevs_list": [ 00:26:23.316 { 00:26:23.316 "name": null, 00:26:23.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:23.316 "is_configured": false, 00:26:23.316 "data_offset": 0, 00:26:23.316 "data_size": 65536 00:26:23.316 }, 00:26:23.316 { 00:26:23.316 "name": "BaseBdev2", 00:26:23.316 "uuid": 
"d523bd58-d85b-5a55-856d-ff963d54e5fa", 00:26:23.316 "is_configured": true, 00:26:23.316 "data_offset": 0, 00:26:23.316 "data_size": 65536 00:26:23.316 }, 00:26:23.316 { 00:26:23.316 "name": "BaseBdev3", 00:26:23.316 "uuid": "54c042ac-4a4e-5c9d-bb88-a387b719fa67", 00:26:23.316 "is_configured": true, 00:26:23.316 "data_offset": 0, 00:26:23.316 "data_size": 65536 00:26:23.316 }, 00:26:23.316 { 00:26:23.316 "name": "BaseBdev4", 00:26:23.316 "uuid": "d2d246ac-addf-5d67-bc77-587d2e0546fa", 00:26:23.316 "is_configured": true, 00:26:23.316 "data_offset": 0, 00:26:23.316 "data_size": 65536 00:26:23.316 } 00:26:23.316 ] 00:26:23.316 }' 00:26:23.316 22:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:23.316 22:33:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:23.948 22:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:24.235 [2024-07-12 22:33:34.305301] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:24.235 22:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:24.235 [2024-07-12 22:33:34.351912] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2172fa0 00:26:24.235 [2024-07-12 22:33:34.354339] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:24.235 [2024-07-12 22:33:34.465349] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:24.235 [2024-07-12 22:33:34.465899] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:24.494 [2024-07-12 22:33:34.696558] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:24.494 [2024-07-12 22:33:34.696858] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:24.752 [2024-07-12 22:33:35.042975] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:25.011 [2024-07-12 22:33:35.267670] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:25.011 [2024-07-12 22:33:35.267987] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:25.270 22:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:25.270 22:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:25.270 22:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:25.270 22:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:25.270 22:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:25.270 22:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.270 22:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.528 22:33:35 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:25.528 "name": "raid_bdev1", 00:26:25.528 "uuid": "45c19657-3bce-4c00-9ffc-8598afa523db", 00:26:25.528 "strip_size_kb": 0, 00:26:25.528 "state": "online", 00:26:25.528 "raid_level": "raid1", 00:26:25.528 "superblock": false, 00:26:25.528 "num_base_bdevs": 4, 00:26:25.528 "num_base_bdevs_discovered": 4, 00:26:25.528 "num_base_bdevs_operational": 4, 00:26:25.528 "process": { 00:26:25.528 "type": "rebuild", 00:26:25.528 "target": "spare", 00:26:25.528 "progress": { 00:26:25.528 "blocks": 12288, 00:26:25.528 "percent": 18 00:26:25.528 } 00:26:25.528 }, 00:26:25.528 "base_bdevs_list": [ 00:26:25.528 { 00:26:25.528 "name": "spare", 00:26:25.528 "uuid": "28b9a029-10dc-5305-8fda-6e095eb7959c", 00:26:25.528 "is_configured": true, 00:26:25.528 "data_offset": 0, 00:26:25.528 "data_size": 65536 00:26:25.528 }, 00:26:25.528 { 00:26:25.528 "name": "BaseBdev2", 00:26:25.528 "uuid": "d523bd58-d85b-5a55-856d-ff963d54e5fa", 00:26:25.528 "is_configured": true, 00:26:25.528 "data_offset": 0, 00:26:25.528 "data_size": 65536 00:26:25.528 }, 00:26:25.528 { 00:26:25.528 "name": "BaseBdev3", 00:26:25.528 "uuid": "54c042ac-4a4e-5c9d-bb88-a387b719fa67", 00:26:25.528 "is_configured": true, 00:26:25.528 "data_offset": 0, 00:26:25.528 "data_size": 65536 00:26:25.528 }, 00:26:25.528 { 00:26:25.528 "name": "BaseBdev4", 00:26:25.528 "uuid": "d2d246ac-addf-5d67-bc77-587d2e0546fa", 00:26:25.528 "is_configured": true, 00:26:25.528 "data_offset": 0, 00:26:25.528 "data_size": 65536 00:26:25.528 } 00:26:25.528 ] 00:26:25.528 }' 00:26:25.528 22:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:25.528 22:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:25.528 22:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:25.528 22:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:25.528 22:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:25.528 [2024-07-12 22:33:35.715420] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:25.786 [2024-07-12 22:33:35.919157] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:25.786 [2024-07-12 22:33:35.959068] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:25.786 [2024-07-12 22:33:36.069375] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:25.786 [2024-07-12 22:33:36.072496] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:25.786 [2024-07-12 22:33:36.072528] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:25.786 [2024-07-12 22:33:36.072539] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:25.786 [2024-07-12 22:33:36.096083] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x25a2970 00:26:26.044 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:26.044 22:33:36 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:26.044 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:26.044 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:26.044 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:26.044 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:26.044 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:26.044 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:26.044 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:26.044 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:26.044 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.044 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:26.302 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:26.302 "name": "raid_bdev1", 00:26:26.302 "uuid": "45c19657-3bce-4c00-9ffc-8598afa523db", 00:26:26.302 "strip_size_kb": 0, 00:26:26.302 "state": "online", 00:26:26.302 "raid_level": "raid1", 00:26:26.302 "superblock": false, 00:26:26.302 "num_base_bdevs": 4, 00:26:26.302 "num_base_bdevs_discovered": 3, 00:26:26.302 "num_base_bdevs_operational": 3, 00:26:26.302 "base_bdevs_list": [ 00:26:26.302 { 00:26:26.302 "name": null, 00:26:26.302 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:26.302 "is_configured": false, 00:26:26.302 "data_offset": 0, 00:26:26.302 "data_size": 65536 00:26:26.302 }, 00:26:26.302 { 00:26:26.302 "name": "BaseBdev2", 00:26:26.302 "uuid": "d523bd58-d85b-5a55-856d-ff963d54e5fa", 00:26:26.302 "is_configured": true, 00:26:26.302 "data_offset": 0, 00:26:26.302 "data_size": 65536 00:26:26.302 }, 00:26:26.302 { 00:26:26.302 "name": "BaseBdev3", 00:26:26.302 "uuid": "54c042ac-4a4e-5c9d-bb88-a387b719fa67", 00:26:26.302 "is_configured": true, 00:26:26.302 "data_offset": 0, 00:26:26.302 "data_size": 65536 00:26:26.302 }, 00:26:26.302 { 00:26:26.302 "name": "BaseBdev4", 00:26:26.302 "uuid": "d2d246ac-addf-5d67-bc77-587d2e0546fa", 00:26:26.302 "is_configured": true, 00:26:26.302 "data_offset": 0, 00:26:26.302 "data_size": 65536 00:26:26.302 } 00:26:26.302 ] 00:26:26.302 }' 00:26:26.302 22:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:26.302 22:33:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:26.866 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:26.866 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:26.866 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:26.866 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:26.867 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:26.867 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.867 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.139 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:27.139 "name": "raid_bdev1", 00:26:27.139 "uuid": "45c19657-3bce-4c00-9ffc-8598afa523db", 00:26:27.139 "strip_size_kb": 0, 00:26:27.139 "state": "online", 00:26:27.139 "raid_level": "raid1", 00:26:27.139 "superblock": false, 00:26:27.139 "num_base_bdevs": 4, 00:26:27.139 "num_base_bdevs_discovered": 3, 00:26:27.139 "num_base_bdevs_operational": 3, 00:26:27.139 "base_bdevs_list": [ 00:26:27.139 { 00:26:27.139 "name": null, 00:26:27.139 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.139 "is_configured": false, 00:26:27.139 "data_offset": 0, 00:26:27.139 "data_size": 65536 00:26:27.139 }, 00:26:27.139 { 00:26:27.139 "name": "BaseBdev2", 00:26:27.139 "uuid": "d523bd58-d85b-5a55-856d-ff963d54e5fa", 00:26:27.139 "is_configured": true, 00:26:27.139 "data_offset": 0, 00:26:27.139 "data_size": 65536 00:26:27.139 }, 00:26:27.139 { 00:26:27.139 "name": "BaseBdev3", 00:26:27.139 "uuid": "54c042ac-4a4e-5c9d-bb88-a387b719fa67", 00:26:27.139 "is_configured": true, 00:26:27.139 "data_offset": 0, 00:26:27.139 "data_size": 65536 00:26:27.139 }, 00:26:27.139 { 00:26:27.139 "name": "BaseBdev4", 00:26:27.139 "uuid": "d2d246ac-addf-5d67-bc77-587d2e0546fa", 00:26:27.139 "is_configured": true, 00:26:27.139 "data_offset": 0, 00:26:27.140 "data_size": 65536 00:26:27.140 } 00:26:27.140 ] 00:26:27.140 }' 00:26:27.140 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:27.140 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:27.140 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:27.140 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:27.140 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:27.403 [2024-07-12 22:33:37.533850] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:27.403 22:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:27.403 [2024-07-12 22:33:37.617311] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2611540 00:26:27.403 [2024-07-12 22:33:37.618819] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:27.403 [2024-07-12 22:33:37.720548] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:27.403 [2024-07-12 22:33:37.721004] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:27.661 [2024-07-12 22:33:37.934963] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:27.661 [2024-07-12 22:33:37.935598] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:28.227 [2024-07-12 22:33:38.292165] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:28.486 22:33:38 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:28.486 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:28.486 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:28.486 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:28.486 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:28.486 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.486 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:28.745 [2024-07-12 22:33:38.829096] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:28.745 [2024-07-12 22:33:38.838526] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:28.745 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:28.745 "name": "raid_bdev1", 00:26:28.745 "uuid": "45c19657-3bce-4c00-9ffc-8598afa523db", 00:26:28.745 "strip_size_kb": 0, 00:26:28.745 "state": "online", 00:26:28.745 "raid_level": "raid1", 00:26:28.745 "superblock": false, 00:26:28.745 "num_base_bdevs": 4, 00:26:28.745 "num_base_bdevs_discovered": 4, 00:26:28.745 "num_base_bdevs_operational": 4, 00:26:28.745 "process": { 00:26:28.745 "type": "rebuild", 00:26:28.745 "target": "spare", 00:26:28.745 "progress": { 00:26:28.745 "blocks": 14336, 00:26:28.745 "percent": 21 00:26:28.745 } 00:26:28.745 }, 00:26:28.745 "base_bdevs_list": [ 00:26:28.745 { 00:26:28.745 "name": "spare", 00:26:28.745 "uuid": "28b9a029-10dc-5305-8fda-6e095eb7959c", 00:26:28.745 "is_configured": true, 00:26:28.745 "data_offset": 0, 00:26:28.745 "data_size": 65536 00:26:28.745 }, 00:26:28.745 { 00:26:28.745 "name": "BaseBdev2", 00:26:28.745 "uuid": "d523bd58-d85b-5a55-856d-ff963d54e5fa", 00:26:28.745 "is_configured": true, 00:26:28.745 "data_offset": 0, 00:26:28.745 "data_size": 65536 00:26:28.745 }, 00:26:28.745 { 00:26:28.745 "name": "BaseBdev3", 00:26:28.745 "uuid": "54c042ac-4a4e-5c9d-bb88-a387b719fa67", 00:26:28.745 "is_configured": true, 00:26:28.745 "data_offset": 0, 00:26:28.745 "data_size": 65536 00:26:28.745 }, 00:26:28.745 { 00:26:28.745 "name": "BaseBdev4", 00:26:28.745 "uuid": "d2d246ac-addf-5d67-bc77-587d2e0546fa", 00:26:28.745 "is_configured": true, 00:26:28.745 "data_offset": 0, 00:26:28.745 "data_size": 65536 00:26:28.745 } 00:26:28.745 ] 00:26:28.745 }' 00:26:28.745 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:28.745 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:28.745 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:28.745 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:28.745 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:26:28.745 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:28.745 22:33:38 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:28.745 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:28.745 22:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:28.745 [2024-07-12 22:33:39.054307] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:29.003 [2024-07-12 22:33:39.182707] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:29.003 [2024-07-12 22:33:39.277280] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:29.263 [2024-07-12 22:33:39.358865] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x25a2970 00:26:29.263 [2024-07-12 22:33:39.358890] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2611540 00:26:29.263 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:29.263 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:26:29.263 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:29.263 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:29.263 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:29.263 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:29.263 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:29.263 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.263 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.263 [2024-07-12 22:33:39.481718] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:29.522 "name": "raid_bdev1", 00:26:29.522 "uuid": "45c19657-3bce-4c00-9ffc-8598afa523db", 00:26:29.522 "strip_size_kb": 0, 00:26:29.522 "state": "online", 00:26:29.522 "raid_level": "raid1", 00:26:29.522 "superblock": false, 00:26:29.522 "num_base_bdevs": 4, 00:26:29.522 "num_base_bdevs_discovered": 3, 00:26:29.522 "num_base_bdevs_operational": 3, 00:26:29.522 "process": { 00:26:29.522 "type": "rebuild", 00:26:29.522 "target": "spare", 00:26:29.522 "progress": { 00:26:29.522 "blocks": 22528, 00:26:29.522 "percent": 34 00:26:29.522 } 00:26:29.522 }, 00:26:29.522 "base_bdevs_list": [ 00:26:29.522 { 00:26:29.522 "name": "spare", 00:26:29.522 "uuid": "28b9a029-10dc-5305-8fda-6e095eb7959c", 00:26:29.522 "is_configured": true, 00:26:29.522 "data_offset": 0, 00:26:29.522 "data_size": 65536 00:26:29.522 }, 00:26:29.522 { 00:26:29.522 "name": null, 00:26:29.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.522 "is_configured": false, 00:26:29.522 "data_offset": 0, 00:26:29.522 "data_size": 65536 00:26:29.522 }, 00:26:29.522 { 00:26:29.522 "name": "BaseBdev3", 00:26:29.522 "uuid": 
"54c042ac-4a4e-5c9d-bb88-a387b719fa67", 00:26:29.522 "is_configured": true, 00:26:29.522 "data_offset": 0, 00:26:29.522 "data_size": 65536 00:26:29.522 }, 00:26:29.522 { 00:26:29.522 "name": "BaseBdev4", 00:26:29.522 "uuid": "d2d246ac-addf-5d67-bc77-587d2e0546fa", 00:26:29.522 "is_configured": true, 00:26:29.522 "data_offset": 0, 00:26:29.522 "data_size": 65536 00:26:29.522 } 00:26:29.522 ] 00:26:29.522 }' 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=923 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.522 22:33:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.522 [2024-07-12 22:33:39.762068] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:26:30.089 22:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:30.089 "name": "raid_bdev1", 00:26:30.089 "uuid": "45c19657-3bce-4c00-9ffc-8598afa523db", 00:26:30.089 "strip_size_kb": 0, 00:26:30.089 "state": "online", 00:26:30.089 "raid_level": "raid1", 00:26:30.089 "superblock": false, 00:26:30.089 "num_base_bdevs": 4, 00:26:30.089 "num_base_bdevs_discovered": 3, 00:26:30.089 "num_base_bdevs_operational": 3, 00:26:30.089 "process": { 00:26:30.089 "type": "rebuild", 00:26:30.089 "target": "spare", 00:26:30.089 "progress": { 00:26:30.089 "blocks": 32768, 00:26:30.089 "percent": 50 00:26:30.089 } 00:26:30.089 }, 00:26:30.089 "base_bdevs_list": [ 00:26:30.089 { 00:26:30.089 "name": "spare", 00:26:30.089 "uuid": "28b9a029-10dc-5305-8fda-6e095eb7959c", 00:26:30.089 "is_configured": true, 00:26:30.089 "data_offset": 0, 00:26:30.089 "data_size": 65536 00:26:30.089 }, 00:26:30.089 { 00:26:30.089 "name": null, 00:26:30.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:30.089 "is_configured": false, 00:26:30.089 "data_offset": 0, 00:26:30.089 "data_size": 65536 00:26:30.089 }, 00:26:30.089 { 00:26:30.089 "name": "BaseBdev3", 00:26:30.089 "uuid": "54c042ac-4a4e-5c9d-bb88-a387b719fa67", 00:26:30.089 "is_configured": true, 00:26:30.089 "data_offset": 0, 00:26:30.089 "data_size": 65536 00:26:30.089 }, 00:26:30.089 { 00:26:30.089 "name": "BaseBdev4", 00:26:30.089 
"uuid": "d2d246ac-addf-5d67-bc77-587d2e0546fa", 00:26:30.089 "is_configured": true, 00:26:30.089 "data_offset": 0, 00:26:30.089 "data_size": 65536 00:26:30.089 } 00:26:30.089 ] 00:26:30.089 }' 00:26:30.089 22:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:30.090 22:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:30.090 22:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:30.090 22:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:30.090 22:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:31.024 [2024-07-12 22:33:41.210386] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:26:31.024 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:31.024 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:31.024 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:31.024 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:31.024 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:31.024 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:31.024 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.024 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:31.283 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:31.283 "name": "raid_bdev1", 00:26:31.283 "uuid": "45c19657-3bce-4c00-9ffc-8598afa523db", 00:26:31.283 "strip_size_kb": 0, 00:26:31.283 "state": "online", 00:26:31.283 "raid_level": "raid1", 00:26:31.283 "superblock": false, 00:26:31.283 "num_base_bdevs": 4, 00:26:31.283 "num_base_bdevs_discovered": 3, 00:26:31.283 "num_base_bdevs_operational": 3, 00:26:31.283 "process": { 00:26:31.283 "type": "rebuild", 00:26:31.283 "target": "spare", 00:26:31.283 "progress": { 00:26:31.283 "blocks": 57344, 00:26:31.283 "percent": 87 00:26:31.283 } 00:26:31.283 }, 00:26:31.283 "base_bdevs_list": [ 00:26:31.283 { 00:26:31.283 "name": "spare", 00:26:31.283 "uuid": "28b9a029-10dc-5305-8fda-6e095eb7959c", 00:26:31.283 "is_configured": true, 00:26:31.283 "data_offset": 0, 00:26:31.283 "data_size": 65536 00:26:31.283 }, 00:26:31.283 { 00:26:31.283 "name": null, 00:26:31.283 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:31.283 "is_configured": false, 00:26:31.283 "data_offset": 0, 00:26:31.283 "data_size": 65536 00:26:31.283 }, 00:26:31.283 { 00:26:31.283 "name": "BaseBdev3", 00:26:31.283 "uuid": "54c042ac-4a4e-5c9d-bb88-a387b719fa67", 00:26:31.283 "is_configured": true, 00:26:31.283 "data_offset": 0, 00:26:31.283 "data_size": 65536 00:26:31.283 }, 00:26:31.283 { 00:26:31.283 "name": "BaseBdev4", 00:26:31.283 "uuid": "d2d246ac-addf-5d67-bc77-587d2e0546fa", 00:26:31.283 "is_configured": true, 00:26:31.283 "data_offset": 0, 00:26:31.283 "data_size": 65536 00:26:31.283 } 00:26:31.283 ] 00:26:31.283 }' 00:26:31.283 22:33:41 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:31.542 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:31.542 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:31.542 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:31.542 22:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:31.801 [2024-07-12 22:33:41.947076] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:31.801 [2024-07-12 22:33:42.002512] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:31.801 [2024-07-12 22:33:42.003830] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:32.368 22:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:32.368 22:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:32.368 22:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:32.368 22:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:32.368 22:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:32.368 22:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:32.631 22:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.631 22:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:32.631 22:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:32.631 "name": "raid_bdev1", 00:26:32.631 "uuid": "45c19657-3bce-4c00-9ffc-8598afa523db", 00:26:32.631 "strip_size_kb": 0, 00:26:32.631 "state": "online", 00:26:32.631 "raid_level": "raid1", 00:26:32.631 "superblock": false, 00:26:32.631 "num_base_bdevs": 4, 00:26:32.631 "num_base_bdevs_discovered": 3, 00:26:32.631 "num_base_bdevs_operational": 3, 00:26:32.631 "base_bdevs_list": [ 00:26:32.631 { 00:26:32.631 "name": "spare", 00:26:32.631 "uuid": "28b9a029-10dc-5305-8fda-6e095eb7959c", 00:26:32.631 "is_configured": true, 00:26:32.631 "data_offset": 0, 00:26:32.631 "data_size": 65536 00:26:32.631 }, 00:26:32.631 { 00:26:32.631 "name": null, 00:26:32.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:32.631 "is_configured": false, 00:26:32.631 "data_offset": 0, 00:26:32.631 "data_size": 65536 00:26:32.631 }, 00:26:32.631 { 00:26:32.631 "name": "BaseBdev3", 00:26:32.631 "uuid": "54c042ac-4a4e-5c9d-bb88-a387b719fa67", 00:26:32.631 "is_configured": true, 00:26:32.631 "data_offset": 0, 00:26:32.631 "data_size": 65536 00:26:32.631 }, 00:26:32.631 { 00:26:32.631 "name": "BaseBdev4", 00:26:32.631 "uuid": "d2d246ac-addf-5d67-bc77-587d2e0546fa", 00:26:32.631 "is_configured": true, 00:26:32.631 "data_offset": 0, 00:26:32.631 "data_size": 65536 00:26:32.631 } 00:26:32.631 ] 00:26:32.631 }' 00:26:32.631 22:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:32.896 22:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:32.896 
22:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:32.896 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:32.896 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:26:32.896 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:32.896 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:32.896 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:32.896 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:32.896 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:32.896 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.896 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:33.155 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:33.155 "name": "raid_bdev1", 00:26:33.155 "uuid": "45c19657-3bce-4c00-9ffc-8598afa523db", 00:26:33.155 "strip_size_kb": 0, 00:26:33.155 "state": "online", 00:26:33.155 "raid_level": "raid1", 00:26:33.155 "superblock": false, 00:26:33.155 "num_base_bdevs": 4, 00:26:33.155 "num_base_bdevs_discovered": 3, 00:26:33.155 "num_base_bdevs_operational": 3, 00:26:33.155 "base_bdevs_list": [ 00:26:33.155 { 00:26:33.155 "name": "spare", 00:26:33.155 "uuid": "28b9a029-10dc-5305-8fda-6e095eb7959c", 00:26:33.155 "is_configured": true, 00:26:33.155 "data_offset": 0, 00:26:33.155 "data_size": 65536 00:26:33.155 }, 00:26:33.155 { 00:26:33.155 "name": null, 00:26:33.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:33.155 "is_configured": false, 00:26:33.155 "data_offset": 0, 00:26:33.155 "data_size": 65536 00:26:33.156 }, 00:26:33.156 { 00:26:33.156 "name": "BaseBdev3", 00:26:33.156 "uuid": "54c042ac-4a4e-5c9d-bb88-a387b719fa67", 00:26:33.156 "is_configured": true, 00:26:33.156 "data_offset": 0, 00:26:33.156 "data_size": 65536 00:26:33.156 }, 00:26:33.156 { 00:26:33.156 "name": "BaseBdev4", 00:26:33.156 "uuid": "d2d246ac-addf-5d67-bc77-587d2e0546fa", 00:26:33.156 "is_configured": true, 00:26:33.156 "data_offset": 0, 00:26:33.156 "data_size": 65536 00:26:33.156 } 00:26:33.156 ] 00:26:33.156 }' 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:33.156 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:33.415 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:33.415 "name": "raid_bdev1", 00:26:33.415 "uuid": "45c19657-3bce-4c00-9ffc-8598afa523db", 00:26:33.415 "strip_size_kb": 0, 00:26:33.415 "state": "online", 00:26:33.415 "raid_level": "raid1", 00:26:33.415 "superblock": false, 00:26:33.415 "num_base_bdevs": 4, 00:26:33.415 "num_base_bdevs_discovered": 3, 00:26:33.415 "num_base_bdevs_operational": 3, 00:26:33.415 "base_bdevs_list": [ 00:26:33.415 { 00:26:33.415 "name": "spare", 00:26:33.415 "uuid": "28b9a029-10dc-5305-8fda-6e095eb7959c", 00:26:33.415 "is_configured": true, 00:26:33.415 "data_offset": 0, 00:26:33.415 "data_size": 65536 00:26:33.415 }, 00:26:33.415 { 00:26:33.415 "name": null, 00:26:33.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:33.415 "is_configured": false, 00:26:33.415 "data_offset": 0, 00:26:33.415 "data_size": 65536 00:26:33.415 }, 00:26:33.415 { 00:26:33.415 "name": "BaseBdev3", 00:26:33.415 "uuid": "54c042ac-4a4e-5c9d-bb88-a387b719fa67", 00:26:33.415 "is_configured": true, 00:26:33.415 "data_offset": 0, 00:26:33.415 "data_size": 65536 00:26:33.415 }, 00:26:33.415 { 00:26:33.415 "name": "BaseBdev4", 00:26:33.415 "uuid": "d2d246ac-addf-5d67-bc77-587d2e0546fa", 00:26:33.415 "is_configured": true, 00:26:33.415 "data_offset": 0, 00:26:33.415 "data_size": 65536 00:26:33.415 } 00:26:33.415 ] 00:26:33.415 }' 00:26:33.415 22:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:33.415 22:33:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:33.983 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:34.242 [2024-07-12 22:33:44.358229] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:34.242 [2024-07-12 22:33:44.358265] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:34.242 00:26:34.242 Latency(us) 00:26:34.242 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:34.242 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:34.242 raid_bdev1 : 11.39 95.37 286.11 0.00 0.00 14550.41 299.19 112607.94 00:26:34.242 =================================================================================================================== 00:26:34.242 Total : 95.37 286.11 0.00 0.00 14550.41 299.19 112607.94 00:26:34.242 [2024-07-12 22:33:44.458460] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:34.242 [2024-07-12 22:33:44.458492] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:34.242 [2024-07-12 22:33:44.458586] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:34.242 [2024-07-12 22:33:44.458598] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x259c8a0 name raid_bdev1, state offline 00:26:34.242 0 00:26:34.242 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:34.242 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:34.501 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:34.761 /dev/nbd0 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:34.761 1+0 records in 00:26:34.761 1+0 
records out 00:26:34.761 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000230703 s, 17.8 MB/s 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:34.761 22:33:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:35.020 /dev/nbd1 00:26:35.020 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:35.020 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:35.020 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:35.020 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:35.020 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:35.020 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:35.020 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:35.021 1+0 records in 00:26:35.021 1+0 records out 00:26:35.021 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288216 s, 14.2 MB/s 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:35.021 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:35.280 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:26:35.539 /dev/nbd1 00:26:35.539 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:35.539 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:35.539 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:35.539 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:35.539 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:35.539 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:35.539 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:35.539 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:35.539 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:35.539 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:35.539 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:35.539 1+0 records in 00:26:35.539 1+0 records out 00:26:35.539 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266006 s, 15.4 MB/s 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:35.540 
22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:35.540 22:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:35.798 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:36.069 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:36.070 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 3552110 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 3552110 ']' 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 3552110 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- 
# uname 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3552110 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3552110' 00:26:36.328 killing process with pid 3552110 00:26:36.328 22:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 3552110 00:26:36.328 Received shutdown signal, test time was about 13.385521 seconds 00:26:36.328 00:26:36.329 Latency(us) 00:26:36.329 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:36.329 =================================================================================================================== 00:26:36.329 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:36.329 [2024-07-12 22:33:46.457510] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:36.329 22:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 3552110 00:26:36.329 [2024-07-12 22:33:46.505861] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:36.588 00:26:36.588 real 0m19.079s 00:26:36.588 user 0m29.561s 00:26:36.588 sys 0m3.422s 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:36.588 ************************************ 00:26:36.588 END TEST raid_rebuild_test_io 00:26:36.588 ************************************ 00:26:36.588 22:33:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:36.588 22:33:46 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:26:36.588 22:33:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:36.588 22:33:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:36.588 22:33:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:36.588 ************************************ 00:26:36.588 START TEST raid_rebuild_test_sb_io 00:26:36.588 ************************************ 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=3554814 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 3554814 /var/tmp/spdk-raid.sock 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 3554814 ']' 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:36.588 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:36.588 22:33:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:36.588 [2024-07-12 22:33:46.889349] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:26:36.588 [2024-07-12 22:33:46.889416] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3554814 ] 00:26:36.588 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:36.588 Zero copy mechanism will not be used. 00:26:36.848 [2024-07-12 22:33:47.008204] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:36.848 [2024-07-12 22:33:47.114514] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:37.107 [2024-07-12 22:33:47.181993] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:37.107 [2024-07-12 22:33:47.182035] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:37.674 22:33:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:37.674 22:33:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:26:37.674 22:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:37.674 22:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:37.933 BaseBdev1_malloc 00:26:37.933 22:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:37.933 [2024-07-12 22:33:48.258041] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:37.933 [2024-07-12 22:33:48.258088] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:37.933 [2024-07-12 22:33:48.258113] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e61d40 00:26:37.933 [2024-07-12 22:33:48.258126] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:38.192 [2024-07-12 22:33:48.259863] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:38.192 [2024-07-12 22:33:48.259890] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:38.192 BaseBdev1 00:26:38.192 22:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:38.192 22:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:38.192 BaseBdev2_malloc 00:26:38.192 22:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:38.450 [2024-07-12 22:33:48.733449] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:38.450 [2024-07-12 22:33:48.733495] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:38.450 [2024-07-12 22:33:48.733519] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e62860 00:26:38.450 [2024-07-12 22:33:48.733532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:38.450 [2024-07-12 22:33:48.735125] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:38.450 [2024-07-12 22:33:48.735155] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:38.450 BaseBdev2 00:26:38.451 22:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:38.451 22:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:38.709 BaseBdev3_malloc 00:26:38.709 22:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:38.986 [2024-07-12 22:33:49.208549] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:38.986 [2024-07-12 22:33:49.208595] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:38.986 [2024-07-12 22:33:49.208616] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x200f8f0 00:26:38.986 [2024-07-12 22:33:49.208628] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:38.986 [2024-07-12 22:33:49.210188] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:38.986 [2024-07-12 22:33:49.210217] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:38.986 BaseBdev3 00:26:38.986 22:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:38.986 22:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:39.247 BaseBdev4_malloc 00:26:39.247 22:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:39.505 [2024-07-12 22:33:49.678369] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:39.505 [2024-07-12 22:33:49.678415] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:39.505 [2024-07-12 22:33:49.678436] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x200ead0 00:26:39.505 [2024-07-12 22:33:49.678448] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:39.505 [2024-07-12 22:33:49.679985] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:39.505 [2024-07-12 22:33:49.680012] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:26:39.505 BaseBdev4 00:26:39.505 22:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:39.763 spare_malloc 00:26:39.763 22:33:49 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:40.022 spare_delay 00:26:40.022 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:40.280 [2024-07-12 22:33:50.396842] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:40.280 [2024-07-12 22:33:50.396886] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:40.280 [2024-07-12 22:33:50.396907] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20135b0 00:26:40.280 [2024-07-12 22:33:50.396944] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:40.280 [2024-07-12 22:33:50.398530] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:40.280 [2024-07-12 22:33:50.398560] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:40.280 spare 00:26:40.281 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:40.540 [2024-07-12 22:33:50.637512] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:40.540 [2024-07-12 22:33:50.638877] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:40.540 [2024-07-12 22:33:50.638942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:40.540 [2024-07-12 22:33:50.638988] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:40.540 [2024-07-12 22:33:50.639196] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f928a0 00:26:40.540 [2024-07-12 22:33:50.639208] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:40.540 [2024-07-12 22:33:50.639414] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x200ce10 00:26:40.540 [2024-07-12 22:33:50.639571] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f928a0 00:26:40.540 [2024-07-12 22:33:50.639582] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f928a0 00:26:40.540 [2024-07-12 22:33:50.639683] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:40.540 "name": "raid_bdev1", 00:26:40.540 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:40.540 "strip_size_kb": 0, 00:26:40.540 "state": "online", 00:26:40.540 "raid_level": "raid1", 00:26:40.540 "superblock": true, 00:26:40.540 "num_base_bdevs": 4, 00:26:40.540 "num_base_bdevs_discovered": 4, 00:26:40.540 "num_base_bdevs_operational": 4, 00:26:40.540 "base_bdevs_list": [ 00:26:40.540 { 00:26:40.540 "name": "BaseBdev1", 00:26:40.540 "uuid": "ed9ad5ab-cacb-5578-aa0c-0bfff6b3c1b4", 00:26:40.540 "is_configured": true, 00:26:40.540 "data_offset": 2048, 00:26:40.540 "data_size": 63488 00:26:40.540 }, 00:26:40.540 { 00:26:40.540 "name": "BaseBdev2", 00:26:40.540 "uuid": "0bf9da65-31ee-5691-8734-703a1b69cadb", 00:26:40.540 "is_configured": true, 00:26:40.540 "data_offset": 2048, 00:26:40.540 "data_size": 63488 00:26:40.540 }, 00:26:40.540 { 00:26:40.540 "name": "BaseBdev3", 00:26:40.540 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:40.540 "is_configured": true, 00:26:40.540 "data_offset": 2048, 00:26:40.540 "data_size": 63488 00:26:40.540 }, 00:26:40.540 { 00:26:40.540 "name": "BaseBdev4", 00:26:40.540 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:40.540 "is_configured": true, 00:26:40.540 "data_offset": 2048, 00:26:40.540 "data_size": 63488 00:26:40.540 } 00:26:40.540 ] 00:26:40.540 }' 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:40.540 22:33:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:41.108 22:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:41.108 22:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:41.367 [2024-07-12 22:33:51.564381] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:41.367 22:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:26:41.367 22:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.367 22:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:41.627 22:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:26:41.627 22:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:26:41.627 22:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:26:41.627 22:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:41.627 [2024-07-12 22:33:51.947222] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e61670 00:26:41.627 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:41.627 Zero copy mechanism will not be used. 00:26:41.627 Running I/O for 60 seconds... 00:26:41.887 [2024-07-12 22:33:52.059047] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:41.887 [2024-07-12 22:33:52.067281] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1e61670 00:26:41.887 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:41.887 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:41.887 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:41.887 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:41.887 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:41.887 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:41.887 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:41.887 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:41.887 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:41.887 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:41.887 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.887 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.146 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:42.146 "name": "raid_bdev1", 00:26:42.146 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:42.146 "strip_size_kb": 0, 00:26:42.146 "state": "online", 00:26:42.146 "raid_level": "raid1", 00:26:42.146 "superblock": true, 00:26:42.146 "num_base_bdevs": 4, 00:26:42.146 "num_base_bdevs_discovered": 3, 00:26:42.146 "num_base_bdevs_operational": 3, 00:26:42.146 "base_bdevs_list": [ 00:26:42.146 { 00:26:42.146 "name": null, 00:26:42.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:42.146 "is_configured": false, 00:26:42.146 "data_offset": 2048, 00:26:42.146 "data_size": 63488 00:26:42.146 }, 00:26:42.146 { 00:26:42.146 "name": "BaseBdev2", 00:26:42.146 "uuid": "0bf9da65-31ee-5691-8734-703a1b69cadb", 00:26:42.146 "is_configured": true, 00:26:42.146 "data_offset": 2048, 00:26:42.146 "data_size": 63488 00:26:42.146 }, 00:26:42.146 { 00:26:42.146 "name": "BaseBdev3", 00:26:42.146 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:42.146 "is_configured": true, 00:26:42.146 "data_offset": 2048, 00:26:42.146 "data_size": 63488 00:26:42.146 }, 00:26:42.146 { 00:26:42.146 "name": "BaseBdev4", 00:26:42.146 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:42.146 "is_configured": 
true, 00:26:42.146 "data_offset": 2048, 00:26:42.146 "data_size": 63488 00:26:42.146 } 00:26:42.146 ] 00:26:42.146 }' 00:26:42.146 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:42.146 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:42.733 22:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:43.023 [2024-07-12 22:33:53.220162] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:43.023 22:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:43.023 [2024-07-12 22:33:53.276823] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f94b40 00:26:43.023 [2024-07-12 22:33:53.279238] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:43.309 [2024-07-12 22:33:53.398812] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:43.309 [2024-07-12 22:33:53.400128] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:43.309 [2024-07-12 22:33:53.632917] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:43.309 [2024-07-12 22:33:53.633637] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:43.877 [2024-07-12 22:33:54.000121] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:43.877 [2024-07-12 22:33:54.000454] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:44.136 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:44.136 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:44.136 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:44.136 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:44.136 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:44.137 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.137 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:44.137 [2024-07-12 22:33:54.388360] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:44.396 [2024-07-12 22:33:54.489769] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:44.396 [2024-07-12 22:33:54.490069] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:44.396 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:44.396 "name": "raid_bdev1", 00:26:44.396 "uuid": 
"6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:44.396 "strip_size_kb": 0, 00:26:44.396 "state": "online", 00:26:44.396 "raid_level": "raid1", 00:26:44.396 "superblock": true, 00:26:44.396 "num_base_bdevs": 4, 00:26:44.396 "num_base_bdevs_discovered": 4, 00:26:44.396 "num_base_bdevs_operational": 4, 00:26:44.396 "process": { 00:26:44.396 "type": "rebuild", 00:26:44.396 "target": "spare", 00:26:44.396 "progress": { 00:26:44.396 "blocks": 16384, 00:26:44.396 "percent": 25 00:26:44.396 } 00:26:44.396 }, 00:26:44.396 "base_bdevs_list": [ 00:26:44.396 { 00:26:44.396 "name": "spare", 00:26:44.396 "uuid": "354f828e-15cd-51f6-bc4e-a37d9f077932", 00:26:44.396 "is_configured": true, 00:26:44.396 "data_offset": 2048, 00:26:44.396 "data_size": 63488 00:26:44.396 }, 00:26:44.396 { 00:26:44.396 "name": "BaseBdev2", 00:26:44.396 "uuid": "0bf9da65-31ee-5691-8734-703a1b69cadb", 00:26:44.396 "is_configured": true, 00:26:44.396 "data_offset": 2048, 00:26:44.396 "data_size": 63488 00:26:44.396 }, 00:26:44.396 { 00:26:44.396 "name": "BaseBdev3", 00:26:44.396 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:44.396 "is_configured": true, 00:26:44.396 "data_offset": 2048, 00:26:44.396 "data_size": 63488 00:26:44.396 }, 00:26:44.396 { 00:26:44.396 "name": "BaseBdev4", 00:26:44.396 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:44.396 "is_configured": true, 00:26:44.396 "data_offset": 2048, 00:26:44.396 "data_size": 63488 00:26:44.396 } 00:26:44.396 ] 00:26:44.396 }' 00:26:44.396 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:44.396 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:44.396 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:44.396 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:44.396 22:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:44.656 [2024-07-12 22:33:54.754060] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:44.656 [2024-07-12 22:33:54.787436] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:44.656 [2024-07-12 22:33:54.866060] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:44.656 [2024-07-12 22:33:54.867267] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:44.656 [2024-07-12 22:33:54.976860] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:44.656 [2024-07-12 22:33:54.980158] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:44.656 [2024-07-12 22:33:54.980189] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:44.656 [2024-07-12 22:33:54.980199] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:44.915 [2024-07-12 22:33:54.986353] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1e61670 00:26:44.915 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:44.915 
22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:44.915 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:44.915 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:44.915 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:44.915 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:44.915 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:44.915 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:44.915 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:44.915 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:44.915 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.915 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.174 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:45.174 "name": "raid_bdev1", 00:26:45.174 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:45.174 "strip_size_kb": 0, 00:26:45.174 "state": "online", 00:26:45.174 "raid_level": "raid1", 00:26:45.174 "superblock": true, 00:26:45.174 "num_base_bdevs": 4, 00:26:45.174 "num_base_bdevs_discovered": 3, 00:26:45.174 "num_base_bdevs_operational": 3, 00:26:45.174 "base_bdevs_list": [ 00:26:45.174 { 00:26:45.174 "name": null, 00:26:45.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.174 "is_configured": false, 00:26:45.174 "data_offset": 2048, 00:26:45.174 "data_size": 63488 00:26:45.174 }, 00:26:45.174 { 00:26:45.174 "name": "BaseBdev2", 00:26:45.174 "uuid": "0bf9da65-31ee-5691-8734-703a1b69cadb", 00:26:45.174 "is_configured": true, 00:26:45.174 "data_offset": 2048, 00:26:45.174 "data_size": 63488 00:26:45.174 }, 00:26:45.174 { 00:26:45.174 "name": "BaseBdev3", 00:26:45.174 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:45.174 "is_configured": true, 00:26:45.174 "data_offset": 2048, 00:26:45.174 "data_size": 63488 00:26:45.174 }, 00:26:45.174 { 00:26:45.174 "name": "BaseBdev4", 00:26:45.174 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:45.174 "is_configured": true, 00:26:45.174 "data_offset": 2048, 00:26:45.174 "data_size": 63488 00:26:45.174 } 00:26:45.174 ] 00:26:45.174 }' 00:26:45.174 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:45.174 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:45.740 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:45.740 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:45.740 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:45.740 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:45.740 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:45.740 22:33:55 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.740 22:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:46.065 22:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:46.065 "name": "raid_bdev1", 00:26:46.065 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:46.065 "strip_size_kb": 0, 00:26:46.065 "state": "online", 00:26:46.065 "raid_level": "raid1", 00:26:46.065 "superblock": true, 00:26:46.065 "num_base_bdevs": 4, 00:26:46.065 "num_base_bdevs_discovered": 3, 00:26:46.065 "num_base_bdevs_operational": 3, 00:26:46.065 "base_bdevs_list": [ 00:26:46.065 { 00:26:46.065 "name": null, 00:26:46.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:46.065 "is_configured": false, 00:26:46.065 "data_offset": 2048, 00:26:46.065 "data_size": 63488 00:26:46.065 }, 00:26:46.065 { 00:26:46.065 "name": "BaseBdev2", 00:26:46.065 "uuid": "0bf9da65-31ee-5691-8734-703a1b69cadb", 00:26:46.065 "is_configured": true, 00:26:46.065 "data_offset": 2048, 00:26:46.065 "data_size": 63488 00:26:46.065 }, 00:26:46.065 { 00:26:46.065 "name": "BaseBdev3", 00:26:46.065 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:46.065 "is_configured": true, 00:26:46.065 "data_offset": 2048, 00:26:46.065 "data_size": 63488 00:26:46.065 }, 00:26:46.065 { 00:26:46.065 "name": "BaseBdev4", 00:26:46.065 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:46.065 "is_configured": true, 00:26:46.065 "data_offset": 2048, 00:26:46.065 "data_size": 63488 00:26:46.065 } 00:26:46.065 ] 00:26:46.065 }' 00:26:46.065 22:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:46.065 22:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:46.065 22:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:46.065 22:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:46.065 22:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:46.324 [2024-07-12 22:33:56.460114] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:46.324 22:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:46.324 [2024-07-12 22:33:56.525030] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2007bc0 00:26:46.324 [2024-07-12 22:33:56.526521] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:46.582 [2024-07-12 22:33:56.665296] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:46.582 [2024-07-12 22:33:56.665794] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:46.582 [2024-07-12 22:33:56.778994] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:46.582 [2024-07-12 22:33:56.779580] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:46.840 [2024-07-12 
22:33:57.126126] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:47.407 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:47.407 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:47.407 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:47.407 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:47.407 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:47.407 [2024-07-12 22:33:57.525615] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:47.407 [2024-07-12 22:33:57.525942] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:47.407 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.407 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:47.665 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:47.665 "name": "raid_bdev1", 00:26:47.665 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:47.665 "strip_size_kb": 0, 00:26:47.665 "state": "online", 00:26:47.665 "raid_level": "raid1", 00:26:47.665 "superblock": true, 00:26:47.665 "num_base_bdevs": 4, 00:26:47.665 "num_base_bdevs_discovered": 4, 00:26:47.665 "num_base_bdevs_operational": 4, 00:26:47.665 "process": { 00:26:47.665 "type": "rebuild", 00:26:47.665 "target": "spare", 00:26:47.665 "progress": { 00:26:47.665 "blocks": 16384, 00:26:47.665 "percent": 25 00:26:47.665 } 00:26:47.665 }, 00:26:47.665 "base_bdevs_list": [ 00:26:47.665 { 00:26:47.665 "name": "spare", 00:26:47.665 "uuid": "354f828e-15cd-51f6-bc4e-a37d9f077932", 00:26:47.665 "is_configured": true, 00:26:47.665 "data_offset": 2048, 00:26:47.665 "data_size": 63488 00:26:47.665 }, 00:26:47.665 { 00:26:47.665 "name": "BaseBdev2", 00:26:47.665 "uuid": "0bf9da65-31ee-5691-8734-703a1b69cadb", 00:26:47.665 "is_configured": true, 00:26:47.665 "data_offset": 2048, 00:26:47.665 "data_size": 63488 00:26:47.665 }, 00:26:47.665 { 00:26:47.665 "name": "BaseBdev3", 00:26:47.665 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:47.665 "is_configured": true, 00:26:47.665 "data_offset": 2048, 00:26:47.665 "data_size": 63488 00:26:47.665 }, 00:26:47.665 { 00:26:47.665 "name": "BaseBdev4", 00:26:47.665 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:47.665 "is_configured": true, 00:26:47.665 "data_offset": 2048, 00:26:47.665 "data_size": 63488 00:26:47.665 } 00:26:47.665 ] 00:26:47.665 }' 00:26:47.665 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:47.665 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:47.665 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:47.665 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:47.665 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:47.665 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:47.665 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:47.665 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:47.665 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:47.665 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:47.665 22:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:47.923 [2024-07-12 22:33:58.077277] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:47.923 [2024-07-12 22:33:58.077534] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:47.923 [2024-07-12 22:33:58.093616] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:48.182 [2024-07-12 22:33:58.317807] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1e61670 00:26:48.182 [2024-07-12 22:33:58.317835] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2007bc0 00:26:48.182 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:48.182 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:26:48.182 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:48.182 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:48.182 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:48.182 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:48.182 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:48.182 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:48.182 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.182 [2024-07-12 22:33:58.440784] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:26:48.441 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:48.441 "name": "raid_bdev1", 00:26:48.441 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:48.441 "strip_size_kb": 0, 00:26:48.441 "state": "online", 00:26:48.441 "raid_level": "raid1", 00:26:48.441 "superblock": true, 00:26:48.441 "num_base_bdevs": 4, 00:26:48.441 "num_base_bdevs_discovered": 3, 00:26:48.441 "num_base_bdevs_operational": 3, 00:26:48.441 "process": { 00:26:48.441 "type": "rebuild", 00:26:48.441 "target": "spare", 00:26:48.441 "progress": { 00:26:48.441 "blocks": 26624, 00:26:48.441 "percent": 41 00:26:48.441 } 00:26:48.441 }, 00:26:48.441 "base_bdevs_list": [ 00:26:48.441 { 00:26:48.441 "name": "spare", 
00:26:48.441 "uuid": "354f828e-15cd-51f6-bc4e-a37d9f077932", 00:26:48.441 "is_configured": true, 00:26:48.441 "data_offset": 2048, 00:26:48.441 "data_size": 63488 00:26:48.441 }, 00:26:48.441 { 00:26:48.441 "name": null, 00:26:48.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.441 "is_configured": false, 00:26:48.441 "data_offset": 2048, 00:26:48.441 "data_size": 63488 00:26:48.441 }, 00:26:48.441 { 00:26:48.441 "name": "BaseBdev3", 00:26:48.441 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:48.441 "is_configured": true, 00:26:48.441 "data_offset": 2048, 00:26:48.441 "data_size": 63488 00:26:48.441 }, 00:26:48.441 { 00:26:48.441 "name": "BaseBdev4", 00:26:48.441 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:48.441 "is_configured": true, 00:26:48.441 "data_offset": 2048, 00:26:48.441 "data_size": 63488 00:26:48.441 } 00:26:48.441 ] 00:26:48.441 }' 00:26:48.441 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:48.441 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:48.441 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:48.441 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:48.441 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=942 00:26:48.441 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:48.441 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:48.441 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:48.442 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:48.442 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:48.442 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:48.442 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:48.442 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.701 [2024-07-12 22:33:58.910385] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:26:48.701 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:48.701 "name": "raid_bdev1", 00:26:48.701 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:48.701 "strip_size_kb": 0, 00:26:48.701 "state": "online", 00:26:48.701 "raid_level": "raid1", 00:26:48.701 "superblock": true, 00:26:48.701 "num_base_bdevs": 4, 00:26:48.701 "num_base_bdevs_discovered": 3, 00:26:48.701 "num_base_bdevs_operational": 3, 00:26:48.701 "process": { 00:26:48.701 "type": "rebuild", 00:26:48.701 "target": "spare", 00:26:48.701 "progress": { 00:26:48.701 "blocks": 34816, 00:26:48.701 "percent": 54 00:26:48.701 } 00:26:48.701 }, 00:26:48.701 "base_bdevs_list": [ 00:26:48.701 { 00:26:48.701 "name": "spare", 00:26:48.701 "uuid": "354f828e-15cd-51f6-bc4e-a37d9f077932", 00:26:48.701 "is_configured": true, 00:26:48.701 "data_offset": 2048, 00:26:48.701 "data_size": 63488 
00:26:48.701 }, 00:26:48.701 { 00:26:48.701 "name": null, 00:26:48.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.701 "is_configured": false, 00:26:48.701 "data_offset": 2048, 00:26:48.701 "data_size": 63488 00:26:48.701 }, 00:26:48.701 { 00:26:48.701 "name": "BaseBdev3", 00:26:48.701 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:48.701 "is_configured": true, 00:26:48.701 "data_offset": 2048, 00:26:48.701 "data_size": 63488 00:26:48.701 }, 00:26:48.701 { 00:26:48.701 "name": "BaseBdev4", 00:26:48.701 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:48.701 "is_configured": true, 00:26:48.701 "data_offset": 2048, 00:26:48.701 "data_size": 63488 00:26:48.701 } 00:26:48.701 ] 00:26:48.701 }' 00:26:48.701 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:48.701 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:48.701 22:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:48.960 22:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:48.960 22:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:48.960 [2024-07-12 22:33:59.131040] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:26:48.960 [2024-07-12 22:33:59.251982] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:26:49.895 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:49.895 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:49.895 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:49.895 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:49.895 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:49.895 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:49.895 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.895 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.153 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:50.153 "name": "raid_bdev1", 00:26:50.153 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:50.153 "strip_size_kb": 0, 00:26:50.153 "state": "online", 00:26:50.153 "raid_level": "raid1", 00:26:50.153 "superblock": true, 00:26:50.153 "num_base_bdevs": 4, 00:26:50.153 "num_base_bdevs_discovered": 3, 00:26:50.153 "num_base_bdevs_operational": 3, 00:26:50.153 "process": { 00:26:50.153 "type": "rebuild", 00:26:50.153 "target": "spare", 00:26:50.153 "progress": { 00:26:50.153 "blocks": 57344, 00:26:50.153 "percent": 90 00:26:50.153 } 00:26:50.153 }, 00:26:50.153 "base_bdevs_list": [ 00:26:50.153 { 00:26:50.153 "name": "spare", 00:26:50.153 "uuid": "354f828e-15cd-51f6-bc4e-a37d9f077932", 00:26:50.153 "is_configured": true, 00:26:50.153 "data_offset": 2048, 00:26:50.153 "data_size": 63488 00:26:50.153 }, 
00:26:50.153 { 00:26:50.153 "name": null, 00:26:50.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.153 "is_configured": false, 00:26:50.153 "data_offset": 2048, 00:26:50.154 "data_size": 63488 00:26:50.154 }, 00:26:50.154 { 00:26:50.154 "name": "BaseBdev3", 00:26:50.154 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:50.154 "is_configured": true, 00:26:50.154 "data_offset": 2048, 00:26:50.154 "data_size": 63488 00:26:50.154 }, 00:26:50.154 { 00:26:50.154 "name": "BaseBdev4", 00:26:50.154 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:50.154 "is_configured": true, 00:26:50.154 "data_offset": 2048, 00:26:50.154 "data_size": 63488 00:26:50.154 } 00:26:50.154 ] 00:26:50.154 }' 00:26:50.154 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:50.154 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:50.154 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:50.154 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:50.154 22:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:50.412 [2024-07-12 22:34:00.498815] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:50.412 [2024-07-12 22:34:00.607081] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:50.412 [2024-07-12 22:34:00.610407] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:51.346 "name": "raid_bdev1", 00:26:51.346 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:51.346 "strip_size_kb": 0, 00:26:51.346 "state": "online", 00:26:51.346 "raid_level": "raid1", 00:26:51.346 "superblock": true, 00:26:51.346 "num_base_bdevs": 4, 00:26:51.346 "num_base_bdevs_discovered": 3, 00:26:51.346 "num_base_bdevs_operational": 3, 00:26:51.346 "base_bdevs_list": [ 00:26:51.346 { 00:26:51.346 "name": "spare", 00:26:51.346 "uuid": "354f828e-15cd-51f6-bc4e-a37d9f077932", 00:26:51.346 "is_configured": true, 00:26:51.346 "data_offset": 2048, 00:26:51.346 "data_size": 63488 00:26:51.346 }, 00:26:51.346 { 00:26:51.346 "name": null, 00:26:51.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:51.346 "is_configured": false, 00:26:51.346 
"data_offset": 2048, 00:26:51.346 "data_size": 63488 00:26:51.346 }, 00:26:51.346 { 00:26:51.346 "name": "BaseBdev3", 00:26:51.346 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:51.346 "is_configured": true, 00:26:51.346 "data_offset": 2048, 00:26:51.346 "data_size": 63488 00:26:51.346 }, 00:26:51.346 { 00:26:51.346 "name": "BaseBdev4", 00:26:51.346 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:51.346 "is_configured": true, 00:26:51.346 "data_offset": 2048, 00:26:51.346 "data_size": 63488 00:26:51.346 } 00:26:51.346 ] 00:26:51.346 }' 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:51.346 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:51.605 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.605 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:51.605 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:51.605 "name": "raid_bdev1", 00:26:51.605 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:51.605 "strip_size_kb": 0, 00:26:51.605 "state": "online", 00:26:51.605 "raid_level": "raid1", 00:26:51.605 "superblock": true, 00:26:51.605 "num_base_bdevs": 4, 00:26:51.605 "num_base_bdevs_discovered": 3, 00:26:51.605 "num_base_bdevs_operational": 3, 00:26:51.605 "base_bdevs_list": [ 00:26:51.605 { 00:26:51.605 "name": "spare", 00:26:51.605 "uuid": "354f828e-15cd-51f6-bc4e-a37d9f077932", 00:26:51.605 "is_configured": true, 00:26:51.605 "data_offset": 2048, 00:26:51.605 "data_size": 63488 00:26:51.605 }, 00:26:51.605 { 00:26:51.605 "name": null, 00:26:51.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:51.605 "is_configured": false, 00:26:51.605 "data_offset": 2048, 00:26:51.605 "data_size": 63488 00:26:51.605 }, 00:26:51.605 { 00:26:51.605 "name": "BaseBdev3", 00:26:51.605 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:51.605 "is_configured": true, 00:26:51.605 "data_offset": 2048, 00:26:51.605 "data_size": 63488 00:26:51.605 }, 00:26:51.605 { 00:26:51.605 "name": "BaseBdev4", 00:26:51.605 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:51.605 "is_configured": true, 00:26:51.605 "data_offset": 2048, 00:26:51.605 "data_size": 63488 00:26:51.605 } 00:26:51.605 ] 00:26:51.605 }' 00:26:51.605 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:26:51.863 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:51.863 22:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.863 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:52.122 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:52.122 "name": "raid_bdev1", 00:26:52.122 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:52.122 "strip_size_kb": 0, 00:26:52.122 "state": "online", 00:26:52.122 "raid_level": "raid1", 00:26:52.122 "superblock": true, 00:26:52.122 "num_base_bdevs": 4, 00:26:52.122 "num_base_bdevs_discovered": 3, 00:26:52.122 "num_base_bdevs_operational": 3, 00:26:52.122 "base_bdevs_list": [ 00:26:52.122 { 00:26:52.122 "name": "spare", 00:26:52.122 "uuid": "354f828e-15cd-51f6-bc4e-a37d9f077932", 00:26:52.122 "is_configured": true, 00:26:52.122 "data_offset": 2048, 00:26:52.122 "data_size": 63488 00:26:52.122 }, 00:26:52.122 { 00:26:52.122 "name": null, 00:26:52.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:52.122 "is_configured": false, 00:26:52.122 "data_offset": 2048, 00:26:52.122 "data_size": 63488 00:26:52.122 }, 00:26:52.122 { 00:26:52.122 "name": "BaseBdev3", 00:26:52.122 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:52.122 "is_configured": true, 00:26:52.122 "data_offset": 2048, 00:26:52.122 "data_size": 63488 00:26:52.122 }, 00:26:52.122 { 00:26:52.122 "name": "BaseBdev4", 00:26:52.122 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:52.122 "is_configured": true, 00:26:52.122 "data_offset": 2048, 00:26:52.122 "data_size": 63488 00:26:52.122 } 00:26:52.122 ] 00:26:52.122 }' 00:26:52.122 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:52.122 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:52.688 22:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:52.947 [2024-07-12 22:34:03.052574] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:52.947 [2024-07-12 22:34:03.052607] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:52.947 00:26:52.947 Latency(us) 00:26:52.947 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:52.947 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:52.947 raid_bdev1 : 11.18 99.86 299.57 0.00 0.00 13765.99 295.62 113063.85 00:26:52.947 =================================================================================================================== 00:26:52.947 Total : 99.86 299.57 0.00 0.00 13765.99 295.62 113063.85 00:26:52.947 [2024-07-12 22:34:03.156769] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:52.947 [2024-07-12 22:34:03.156798] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:52.947 [2024-07-12 22:34:03.156891] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:52.947 [2024-07-12 22:34:03.156903] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f928a0 name raid_bdev1, state offline 00:26:52.947 0 00:26:52.947 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.947 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:53.205 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:53.464 /dev/nbd0 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd0 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:53.464 1+0 records in 00:26:53.464 1+0 records out 00:26:53.464 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270829 s, 15.1 MB/s 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:53.464 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:53.722 /dev/nbd1 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:53.722 1+0 records in 00:26:53.722 1+0 records out 00:26:53.722 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000175786 s, 23.3 MB/s 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:53.722 22:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:53.722 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:53.722 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:53.722 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:53.722 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:53.722 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:53.722 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:53.722 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:53.980 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 
00:26:53.980 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:53.980 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:53.980 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:53.980 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:53.980 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:53.980 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:54.238 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:26:54.496 /dev/nbd1 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:54.496 
22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:54.496 1+0 records in 00:26:54.496 1+0 records out 00:26:54.496 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00029475 s, 13.9 MB/s 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:54.496 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@50 -- # local nbd_list 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:54.774 22:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:54.774 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:54.774 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:54.774 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:54.774 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:54.774 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:54.774 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:54.774 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:54.774 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:54.774 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:54.774 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:55.033 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:55.291 [2024-07-12 22:34:05.538257] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:55.291 [2024-07-12 22:34:05.538303] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:55.291 [2024-07-12 22:34:05.538324] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2011e80 00:26:55.291 [2024-07-12 22:34:05.538337] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:55.291 [2024-07-12 22:34:05.539964] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:55.291 [2024-07-12 22:34:05.539992] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:55.291 [2024-07-12 22:34:05.540073] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:55.291 [2024-07-12 22:34:05.540100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:55.291 [2024-07-12 22:34:05.540210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:55.291 [2024-07-12 22:34:05.540285] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:55.291 spare 00:26:55.291 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:55.291 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:55.291 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:55.291 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:55.291 22:34:05 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:55.291 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:55.291 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:55.291 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:55.291 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:55.291 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:55.291 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.291 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:55.550 [2024-07-12 22:34:05.640603] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f97ef0 00:26:55.550 [2024-07-12 22:34:05.640619] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:55.550 [2024-07-12 22:34:05.640801] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f92ce0 00:26:55.550 [2024-07-12 22:34:05.640955] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f97ef0 00:26:55.550 [2024-07-12 22:34:05.640966] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f97ef0 00:26:55.550 [2024-07-12 22:34:05.641073] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:55.550 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:55.550 "name": "raid_bdev1", 00:26:55.550 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:55.550 "strip_size_kb": 0, 00:26:55.550 "state": "online", 00:26:55.550 "raid_level": "raid1", 00:26:55.550 "superblock": true, 00:26:55.550 "num_base_bdevs": 4, 00:26:55.550 "num_base_bdevs_discovered": 3, 00:26:55.550 "num_base_bdevs_operational": 3, 00:26:55.550 "base_bdevs_list": [ 00:26:55.550 { 00:26:55.550 "name": "spare", 00:26:55.550 "uuid": "354f828e-15cd-51f6-bc4e-a37d9f077932", 00:26:55.550 "is_configured": true, 00:26:55.550 "data_offset": 2048, 00:26:55.550 "data_size": 63488 00:26:55.550 }, 00:26:55.550 { 00:26:55.550 "name": null, 00:26:55.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:55.550 "is_configured": false, 00:26:55.551 "data_offset": 2048, 00:26:55.551 "data_size": 63488 00:26:55.551 }, 00:26:55.551 { 00:26:55.551 "name": "BaseBdev3", 00:26:55.551 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:55.551 "is_configured": true, 00:26:55.551 "data_offset": 2048, 00:26:55.551 "data_size": 63488 00:26:55.551 }, 00:26:55.551 { 00:26:55.551 "name": "BaseBdev4", 00:26:55.551 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:55.551 "is_configured": true, 00:26:55.551 "data_offset": 2048, 00:26:55.551 "data_size": 63488 00:26:55.551 } 00:26:55.551 ] 00:26:55.551 }' 00:26:55.551 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:55.551 22:34:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:56.118 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:56.118 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:56.118 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:56.118 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:56.118 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:56.118 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.118 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:56.376 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:56.376 "name": "raid_bdev1", 00:26:56.376 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:56.376 "strip_size_kb": 0, 00:26:56.376 "state": "online", 00:26:56.376 "raid_level": "raid1", 00:26:56.376 "superblock": true, 00:26:56.376 "num_base_bdevs": 4, 00:26:56.376 "num_base_bdevs_discovered": 3, 00:26:56.376 "num_base_bdevs_operational": 3, 00:26:56.376 "base_bdevs_list": [ 00:26:56.376 { 00:26:56.376 "name": "spare", 00:26:56.376 "uuid": "354f828e-15cd-51f6-bc4e-a37d9f077932", 00:26:56.376 "is_configured": true, 00:26:56.376 "data_offset": 2048, 00:26:56.376 "data_size": 63488 00:26:56.376 }, 00:26:56.376 { 00:26:56.376 "name": null, 00:26:56.376 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:56.376 "is_configured": false, 00:26:56.376 "data_offset": 2048, 00:26:56.376 "data_size": 63488 00:26:56.376 }, 00:26:56.376 { 00:26:56.376 "name": "BaseBdev3", 00:26:56.376 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:56.376 "is_configured": true, 00:26:56.376 "data_offset": 2048, 00:26:56.376 "data_size": 63488 00:26:56.376 }, 00:26:56.376 { 00:26:56.376 "name": "BaseBdev4", 00:26:56.376 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:56.376 "is_configured": true, 00:26:56.376 "data_offset": 2048, 00:26:56.376 "data_size": 63488 00:26:56.376 } 00:26:56.376 ] 00:26:56.376 }' 00:26:56.376 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:56.376 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:56.376 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:56.376 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:56.376 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:56.376 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.634 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:56.634 22:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:56.893 [2024-07-12 22:34:07.030695] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:56.893 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:56.893 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 
-- # local raid_bdev_name=raid_bdev1 00:26:56.893 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:56.893 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:56.893 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:56.893 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:56.893 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:56.893 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:56.893 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:56.893 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:56.893 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.893 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.152 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:57.152 "name": "raid_bdev1", 00:26:57.152 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:57.152 "strip_size_kb": 0, 00:26:57.152 "state": "online", 00:26:57.152 "raid_level": "raid1", 00:26:57.152 "superblock": true, 00:26:57.152 "num_base_bdevs": 4, 00:26:57.152 "num_base_bdevs_discovered": 2, 00:26:57.152 "num_base_bdevs_operational": 2, 00:26:57.152 "base_bdevs_list": [ 00:26:57.152 { 00:26:57.152 "name": null, 00:26:57.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.152 "is_configured": false, 00:26:57.152 "data_offset": 2048, 00:26:57.152 "data_size": 63488 00:26:57.152 }, 00:26:57.152 { 00:26:57.152 "name": null, 00:26:57.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.152 "is_configured": false, 00:26:57.152 "data_offset": 2048, 00:26:57.152 "data_size": 63488 00:26:57.152 }, 00:26:57.152 { 00:26:57.152 "name": "BaseBdev3", 00:26:57.152 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:57.152 "is_configured": true, 00:26:57.152 "data_offset": 2048, 00:26:57.152 "data_size": 63488 00:26:57.152 }, 00:26:57.152 { 00:26:57.152 "name": "BaseBdev4", 00:26:57.152 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:57.152 "is_configured": true, 00:26:57.152 "data_offset": 2048, 00:26:57.152 "data_size": 63488 00:26:57.152 } 00:26:57.152 ] 00:26:57.152 }' 00:26:57.152 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:57.152 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:57.720 22:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:57.979 [2024-07-12 22:34:08.118022] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:57.979 [2024-07-12 22:34:08.118181] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:57.979 [2024-07-12 22:34:08.118198] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:57.979 [2024-07-12 22:34:08.118224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:57.979 [2024-07-12 22:34:08.123285] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b68fa0 00:26:57.979 [2024-07-12 22:34:08.125469] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:57.979 22:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:58.917 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:58.917 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:58.917 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:58.917 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:58.917 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:58.917 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.917 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:59.176 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:59.176 "name": "raid_bdev1", 00:26:59.176 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:59.176 "strip_size_kb": 0, 00:26:59.176 "state": "online", 00:26:59.176 "raid_level": "raid1", 00:26:59.176 "superblock": true, 00:26:59.176 "num_base_bdevs": 4, 00:26:59.176 "num_base_bdevs_discovered": 3, 00:26:59.176 "num_base_bdevs_operational": 3, 00:26:59.176 "process": { 00:26:59.176 "type": "rebuild", 00:26:59.176 "target": "spare", 00:26:59.176 "progress": { 00:26:59.176 "blocks": 24576, 00:26:59.176 "percent": 38 00:26:59.176 } 00:26:59.176 }, 00:26:59.176 "base_bdevs_list": [ 00:26:59.176 { 00:26:59.176 "name": "spare", 00:26:59.176 "uuid": "354f828e-15cd-51f6-bc4e-a37d9f077932", 00:26:59.176 "is_configured": true, 00:26:59.176 "data_offset": 2048, 00:26:59.176 "data_size": 63488 00:26:59.176 }, 00:26:59.176 { 00:26:59.176 "name": null, 00:26:59.176 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:59.176 "is_configured": false, 00:26:59.176 "data_offset": 2048, 00:26:59.176 "data_size": 63488 00:26:59.176 }, 00:26:59.176 { 00:26:59.176 "name": "BaseBdev3", 00:26:59.176 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:59.176 "is_configured": true, 00:26:59.176 "data_offset": 2048, 00:26:59.176 "data_size": 63488 00:26:59.176 }, 00:26:59.176 { 00:26:59.176 "name": "BaseBdev4", 00:26:59.176 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:59.176 "is_configured": true, 00:26:59.176 "data_offset": 2048, 00:26:59.176 "data_size": 63488 00:26:59.176 } 00:26:59.176 ] 00:26:59.176 }' 00:26:59.176 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:59.176 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:59.176 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:59.176 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:59.176 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:59.435 [2024-07-12 22:34:09.688650] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:59.435 [2024-07-12 22:34:09.738103] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:59.435 [2024-07-12 22:34:09.738147] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:59.435 [2024-07-12 22:34:09.738164] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:59.435 [2024-07-12 22:34:09.738173] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:59.694 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:59.694 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:59.694 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:59.694 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:59.694 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:59.694 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:59.694 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:59.694 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:59.694 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:59.694 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:59.694 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:59.694 22:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:59.694 22:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:59.694 "name": "raid_bdev1", 00:26:59.694 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:26:59.694 "strip_size_kb": 0, 00:26:59.694 "state": "online", 00:26:59.694 "raid_level": "raid1", 00:26:59.694 "superblock": true, 00:26:59.694 "num_base_bdevs": 4, 00:26:59.694 "num_base_bdevs_discovered": 2, 00:26:59.694 "num_base_bdevs_operational": 2, 00:26:59.694 "base_bdevs_list": [ 00:26:59.694 { 00:26:59.694 "name": null, 00:26:59.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:59.694 "is_configured": false, 00:26:59.694 "data_offset": 2048, 00:26:59.694 "data_size": 63488 00:26:59.694 }, 00:26:59.694 { 00:26:59.694 "name": null, 00:26:59.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:59.694 "is_configured": false, 00:26:59.694 "data_offset": 2048, 00:26:59.694 "data_size": 63488 00:26:59.694 }, 00:26:59.694 { 00:26:59.694 "name": "BaseBdev3", 00:26:59.694 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:26:59.694 "is_configured": true, 00:26:59.694 "data_offset": 2048, 00:26:59.694 "data_size": 63488 00:26:59.694 }, 00:26:59.694 { 00:26:59.694 "name": "BaseBdev4", 00:26:59.694 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:26:59.694 "is_configured": true, 
00:26:59.694 "data_offset": 2048, 00:26:59.694 "data_size": 63488 00:26:59.694 } 00:26:59.694 ] 00:26:59.694 }' 00:26:59.694 22:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:59.694 22:34:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:00.630 22:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:00.630 [2024-07-12 22:34:10.829346] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:00.630 [2024-07-12 22:34:10.829397] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:00.631 [2024-07-12 22:34:10.829421] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f98360 00:27:00.631 [2024-07-12 22:34:10.829439] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:00.631 [2024-07-12 22:34:10.829815] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:00.631 [2024-07-12 22:34:10.829833] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:00.631 [2024-07-12 22:34:10.829916] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:00.631 [2024-07-12 22:34:10.829939] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:00.631 [2024-07-12 22:34:10.829951] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:00.631 [2024-07-12 22:34:10.829971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:00.631 [2024-07-12 22:34:10.834381] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2012320 00:27:00.631 spare 00:27:00.631 [2024-07-12 22:34:10.835856] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:00.631 22:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:01.567 22:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:01.567 22:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:01.567 22:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:01.567 22:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:01.567 22:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:01.567 22:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:01.567 22:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:01.827 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:01.827 "name": "raid_bdev1", 00:27:01.827 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:27:01.827 "strip_size_kb": 0, 00:27:01.827 "state": "online", 00:27:01.827 "raid_level": "raid1", 00:27:01.827 "superblock": true, 00:27:01.827 "num_base_bdevs": 4, 00:27:01.827 "num_base_bdevs_discovered": 3, 00:27:01.827 
"num_base_bdevs_operational": 3, 00:27:01.827 "process": { 00:27:01.827 "type": "rebuild", 00:27:01.827 "target": "spare", 00:27:01.827 "progress": { 00:27:01.827 "blocks": 24576, 00:27:01.827 "percent": 38 00:27:01.827 } 00:27:01.827 }, 00:27:01.827 "base_bdevs_list": [ 00:27:01.827 { 00:27:01.827 "name": "spare", 00:27:01.827 "uuid": "354f828e-15cd-51f6-bc4e-a37d9f077932", 00:27:01.827 "is_configured": true, 00:27:01.827 "data_offset": 2048, 00:27:01.827 "data_size": 63488 00:27:01.827 }, 00:27:01.827 { 00:27:01.827 "name": null, 00:27:01.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:01.827 "is_configured": false, 00:27:01.827 "data_offset": 2048, 00:27:01.827 "data_size": 63488 00:27:01.827 }, 00:27:01.827 { 00:27:01.827 "name": "BaseBdev3", 00:27:01.827 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:27:01.827 "is_configured": true, 00:27:01.827 "data_offset": 2048, 00:27:01.827 "data_size": 63488 00:27:01.827 }, 00:27:01.827 { 00:27:01.827 "name": "BaseBdev4", 00:27:01.827 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:27:01.827 "is_configured": true, 00:27:01.827 "data_offset": 2048, 00:27:01.827 "data_size": 63488 00:27:01.827 } 00:27:01.827 ] 00:27:01.827 }' 00:27:01.827 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:01.827 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:01.827 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:02.087 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:02.087 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:02.346 [2024-07-12 22:34:12.416063] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:02.346 [2024-07-12 22:34:12.448096] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:02.346 [2024-07-12 22:34:12.448143] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:02.346 [2024-07-12 22:34:12.448166] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:02.346 [2024-07-12 22:34:12.448174] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:02.346 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:02.346 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:02.346 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:02.346 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:02.346 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:02.346 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:02.346 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:02.346 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:02.346 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:02.346 
22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:02.346 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:02.346 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:02.666 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:02.666 "name": "raid_bdev1", 00:27:02.666 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:27:02.666 "strip_size_kb": 0, 00:27:02.666 "state": "online", 00:27:02.666 "raid_level": "raid1", 00:27:02.666 "superblock": true, 00:27:02.666 "num_base_bdevs": 4, 00:27:02.666 "num_base_bdevs_discovered": 2, 00:27:02.666 "num_base_bdevs_operational": 2, 00:27:02.666 "base_bdevs_list": [ 00:27:02.666 { 00:27:02.666 "name": null, 00:27:02.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:02.666 "is_configured": false, 00:27:02.666 "data_offset": 2048, 00:27:02.666 "data_size": 63488 00:27:02.666 }, 00:27:02.666 { 00:27:02.666 "name": null, 00:27:02.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:02.666 "is_configured": false, 00:27:02.666 "data_offset": 2048, 00:27:02.666 "data_size": 63488 00:27:02.666 }, 00:27:02.666 { 00:27:02.666 "name": "BaseBdev3", 00:27:02.666 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:27:02.666 "is_configured": true, 00:27:02.666 "data_offset": 2048, 00:27:02.666 "data_size": 63488 00:27:02.666 }, 00:27:02.666 { 00:27:02.666 "name": "BaseBdev4", 00:27:02.666 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:27:02.666 "is_configured": true, 00:27:02.666 "data_offset": 2048, 00:27:02.666 "data_size": 63488 00:27:02.666 } 00:27:02.666 ] 00:27:02.666 }' 00:27:02.666 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:02.666 22:34:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:03.290 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:03.290 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:03.290 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:03.290 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:03.290 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:03.290 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.290 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.290 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:03.290 "name": "raid_bdev1", 00:27:03.290 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:27:03.290 "strip_size_kb": 0, 00:27:03.290 "state": "online", 00:27:03.290 "raid_level": "raid1", 00:27:03.290 "superblock": true, 00:27:03.290 "num_base_bdevs": 4, 00:27:03.290 "num_base_bdevs_discovered": 2, 00:27:03.290 "num_base_bdevs_operational": 2, 00:27:03.290 "base_bdevs_list": [ 00:27:03.290 { 00:27:03.290 "name": null, 00:27:03.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:03.290 
"is_configured": false, 00:27:03.290 "data_offset": 2048, 00:27:03.290 "data_size": 63488 00:27:03.290 }, 00:27:03.290 { 00:27:03.290 "name": null, 00:27:03.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:03.290 "is_configured": false, 00:27:03.290 "data_offset": 2048, 00:27:03.290 "data_size": 63488 00:27:03.290 }, 00:27:03.290 { 00:27:03.290 "name": "BaseBdev3", 00:27:03.290 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:27:03.290 "is_configured": true, 00:27:03.290 "data_offset": 2048, 00:27:03.290 "data_size": 63488 00:27:03.290 }, 00:27:03.290 { 00:27:03.290 "name": "BaseBdev4", 00:27:03.290 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:27:03.290 "is_configured": true, 00:27:03.290 "data_offset": 2048, 00:27:03.290 "data_size": 63488 00:27:03.290 } 00:27:03.290 ] 00:27:03.290 }' 00:27:03.290 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:03.290 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:03.290 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:03.547 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:03.547 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:03.547 22:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:03.806 [2024-07-12 22:34:14.032612] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:03.806 [2024-07-12 22:34:14.032657] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.806 [2024-07-12 22:34:14.032677] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f93060 00:27:03.806 [2024-07-12 22:34:14.032690] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.806 [2024-07-12 22:34:14.033033] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.806 [2024-07-12 22:34:14.033051] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:03.806 [2024-07-12 22:34:14.033113] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:03.806 [2024-07-12 22:34:14.033124] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:03.806 [2024-07-12 22:34:14.033135] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:03.806 BaseBdev1 00:27:03.806 22:34:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:04.742 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:04.742 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:04.742 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:04.742 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:04.742 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:04.742 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:04.742 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:04.742 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:04.742 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:04.742 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:04.742 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.742 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:05.001 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:05.001 "name": "raid_bdev1", 00:27:05.001 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:27:05.001 "strip_size_kb": 0, 00:27:05.001 "state": "online", 00:27:05.001 "raid_level": "raid1", 00:27:05.001 "superblock": true, 00:27:05.001 "num_base_bdevs": 4, 00:27:05.001 "num_base_bdevs_discovered": 2, 00:27:05.001 "num_base_bdevs_operational": 2, 00:27:05.001 "base_bdevs_list": [ 00:27:05.001 { 00:27:05.001 "name": null, 00:27:05.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:05.001 "is_configured": false, 00:27:05.001 "data_offset": 2048, 00:27:05.001 "data_size": 63488 00:27:05.001 }, 00:27:05.001 { 00:27:05.001 "name": null, 00:27:05.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:05.002 "is_configured": false, 00:27:05.002 "data_offset": 2048, 00:27:05.002 "data_size": 63488 00:27:05.002 }, 00:27:05.002 { 00:27:05.002 "name": "BaseBdev3", 00:27:05.002 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:27:05.002 "is_configured": true, 00:27:05.002 "data_offset": 2048, 00:27:05.002 "data_size": 63488 00:27:05.002 }, 00:27:05.002 { 00:27:05.002 "name": "BaseBdev4", 00:27:05.002 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:27:05.002 "is_configured": true, 00:27:05.002 "data_offset": 2048, 00:27:05.002 "data_size": 63488 00:27:05.002 } 00:27:05.002 ] 00:27:05.002 }' 00:27:05.002 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:05.002 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:05.569 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:05.570 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:05.570 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:05.570 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:05.570 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:05.570 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.570 22:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:05.829 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:27:05.829 "name": "raid_bdev1", 00:27:05.829 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:27:05.829 "strip_size_kb": 0, 00:27:05.829 "state": "online", 00:27:05.829 "raid_level": "raid1", 00:27:05.829 "superblock": true, 00:27:05.829 "num_base_bdevs": 4, 00:27:05.829 "num_base_bdevs_discovered": 2, 00:27:05.829 "num_base_bdevs_operational": 2, 00:27:05.829 "base_bdevs_list": [ 00:27:05.829 { 00:27:05.829 "name": null, 00:27:05.829 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:05.829 "is_configured": false, 00:27:05.829 "data_offset": 2048, 00:27:05.829 "data_size": 63488 00:27:05.829 }, 00:27:05.829 { 00:27:05.829 "name": null, 00:27:05.829 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:05.829 "is_configured": false, 00:27:05.829 "data_offset": 2048, 00:27:05.829 "data_size": 63488 00:27:05.829 }, 00:27:05.829 { 00:27:05.829 "name": "BaseBdev3", 00:27:05.829 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:27:05.829 "is_configured": true, 00:27:05.829 "data_offset": 2048, 00:27:05.829 "data_size": 63488 00:27:05.829 }, 00:27:05.829 { 00:27:05.829 "name": "BaseBdev4", 00:27:05.829 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:27:05.829 "is_configured": true, 00:27:05.829 "data_offset": 2048, 00:27:05.829 "data_size": 63488 00:27:05.829 } 00:27:05.829 ] 00:27:05.829 }' 00:27:05.829 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:05.829 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:05.829 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:06.088 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:06.088 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:06.088 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:27:06.088 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:06.088 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:06.088 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:06.088 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:06.088 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:06.088 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:06.088 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:06.088 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:06.088 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:06.088 
22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:06.088 [2024-07-12 22:34:16.395184] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:06.088 [2024-07-12 22:34:16.395310] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:06.088 [2024-07-12 22:34:16.395325] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:06.088 request: 00:27:06.088 { 00:27:06.088 "base_bdev": "BaseBdev1", 00:27:06.088 "raid_bdev": "raid_bdev1", 00:27:06.088 "method": "bdev_raid_add_base_bdev", 00:27:06.088 "req_id": 1 00:27:06.088 } 00:27:06.088 Got JSON-RPC error response 00:27:06.088 response: 00:27:06.088 { 00:27:06.088 "code": -22, 00:27:06.088 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:06.088 } 00:27:06.347 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:27:06.347 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:06.347 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:06.347 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:06.347 22:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:07.285 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:07.285 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:07.285 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:07.285 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:07.285 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:07.285 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:07.285 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:07.285 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:07.285 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:07.285 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:07.285 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.285 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:07.544 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:07.544 "name": "raid_bdev1", 00:27:07.544 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:27:07.544 "strip_size_kb": 0, 00:27:07.544 "state": "online", 00:27:07.544 "raid_level": "raid1", 00:27:07.544 "superblock": true, 00:27:07.544 "num_base_bdevs": 4, 00:27:07.544 "num_base_bdevs_discovered": 2, 00:27:07.544 "num_base_bdevs_operational": 2, 00:27:07.544 "base_bdevs_list": [ 
00:27:07.544 { 00:27:07.544 "name": null, 00:27:07.544 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:07.544 "is_configured": false, 00:27:07.544 "data_offset": 2048, 00:27:07.544 "data_size": 63488 00:27:07.544 }, 00:27:07.544 { 00:27:07.544 "name": null, 00:27:07.544 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:07.544 "is_configured": false, 00:27:07.544 "data_offset": 2048, 00:27:07.544 "data_size": 63488 00:27:07.544 }, 00:27:07.544 { 00:27:07.544 "name": "BaseBdev3", 00:27:07.544 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:27:07.544 "is_configured": true, 00:27:07.544 "data_offset": 2048, 00:27:07.544 "data_size": 63488 00:27:07.544 }, 00:27:07.544 { 00:27:07.544 "name": "BaseBdev4", 00:27:07.544 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:27:07.544 "is_configured": true, 00:27:07.544 "data_offset": 2048, 00:27:07.544 "data_size": 63488 00:27:07.544 } 00:27:07.544 ] 00:27:07.544 }' 00:27:07.544 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:07.544 22:34:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:08.113 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:08.113 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:08.113 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:08.113 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:08.113 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:08.113 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.113 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:08.372 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:08.372 "name": "raid_bdev1", 00:27:08.372 "uuid": "6c820926-907e-4e48-8aac-4b9cfe35022f", 00:27:08.372 "strip_size_kb": 0, 00:27:08.372 "state": "online", 00:27:08.372 "raid_level": "raid1", 00:27:08.372 "superblock": true, 00:27:08.372 "num_base_bdevs": 4, 00:27:08.372 "num_base_bdevs_discovered": 2, 00:27:08.372 "num_base_bdevs_operational": 2, 00:27:08.372 "base_bdevs_list": [ 00:27:08.372 { 00:27:08.372 "name": null, 00:27:08.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.372 "is_configured": false, 00:27:08.372 "data_offset": 2048, 00:27:08.372 "data_size": 63488 00:27:08.372 }, 00:27:08.372 { 00:27:08.372 "name": null, 00:27:08.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.372 "is_configured": false, 00:27:08.372 "data_offset": 2048, 00:27:08.372 "data_size": 63488 00:27:08.372 }, 00:27:08.372 { 00:27:08.372 "name": "BaseBdev3", 00:27:08.372 "uuid": "3146b1fb-9750-5ff0-a0d1-0cd80d1ac5c9", 00:27:08.372 "is_configured": true, 00:27:08.372 "data_offset": 2048, 00:27:08.372 "data_size": 63488 00:27:08.372 }, 00:27:08.372 { 00:27:08.372 "name": "BaseBdev4", 00:27:08.372 "uuid": "f9e50edf-148c-5f7d-9227-e7a22d1db5f5", 00:27:08.372 "is_configured": true, 00:27:08.372 "data_offset": 2048, 00:27:08.372 "data_size": 63488 00:27:08.372 } 00:27:08.372 ] 00:27:08.372 }' 00:27:08.372 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:27:08.372 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:08.372 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:08.372 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:08.372 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 3554814 00:27:08.373 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 3554814 ']' 00:27:08.373 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 3554814 00:27:08.373 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:27:08.373 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:08.373 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3554814 00:27:08.373 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:08.373 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:08.373 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3554814' 00:27:08.373 killing process with pid 3554814 00:27:08.373 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 3554814 00:27:08.373 Received shutdown signal, test time was about 26.654434 seconds 00:27:08.373 00:27:08.373 Latency(us) 00:27:08.373 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:08.373 =================================================================================================================== 00:27:08.373 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:08.373 [2024-07-12 22:34:18.669101] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:08.373 [2024-07-12 22:34:18.669208] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:08.373 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 3554814 00:27:08.373 [2024-07-12 22:34:18.669270] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:08.373 [2024-07-12 22:34:18.669283] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f97ef0 name raid_bdev1, state offline 00:27:08.632 [2024-07-12 22:34:18.717971] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:08.632 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:27:08.632 00:27:08.632 real 0m32.130s 00:27:08.632 user 0m50.373s 00:27:08.632 sys 0m5.006s 00:27:08.632 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:08.632 22:34:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:08.632 ************************************ 00:27:08.632 END TEST raid_rebuild_test_sb_io 00:27:08.632 ************************************ 00:27:08.892 22:34:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:08.892 22:34:18 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:27:08.892 22:34:18 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:27:08.892 22:34:18 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test 
raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:27:08.892 22:34:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:08.892 22:34:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:08.892 22:34:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:08.892 ************************************ 00:27:08.892 START TEST raid_state_function_test_sb_4k 00:27:08.892 ************************************ 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=3559490 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3559490' 00:27:08.892 Process raid pid: 3559490 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 3559490 /var/tmp/spdk-raid.sock 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 3559490 ']' 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:08.892 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:08.892 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:08.892 [2024-07-12 22:34:19.104407] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:27:08.892 [2024-07-12 22:34:19.104462] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:08.892 [2024-07-12 22:34:19.216605] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:09.151 [2024-07-12 22:34:19.313911] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:09.151 [2024-07-12 22:34:19.377797] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:09.151 [2024-07-12 22:34:19.377834] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:09.720 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:09.720 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:27:09.721 22:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:09.980 [2024-07-12 22:34:20.205082] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:09.980 [2024-07-12 22:34:20.205128] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:09.980 [2024-07-12 22:34:20.205139] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:09.980 [2024-07-12 22:34:20.205151] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:09.980 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:09.980 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:09.980 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:09.980 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:09.980 22:34:20 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:09.980 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:09.980 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:09.980 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:09.980 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:09.980 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:09.980 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.980 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:10.239 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:10.239 "name": "Existed_Raid", 00:27:10.240 "uuid": "b4338724-8a2e-4934-aad7-808550783df8", 00:27:10.240 "strip_size_kb": 0, 00:27:10.240 "state": "configuring", 00:27:10.240 "raid_level": "raid1", 00:27:10.240 "superblock": true, 00:27:10.240 "num_base_bdevs": 2, 00:27:10.240 "num_base_bdevs_discovered": 0, 00:27:10.240 "num_base_bdevs_operational": 2, 00:27:10.240 "base_bdevs_list": [ 00:27:10.240 { 00:27:10.240 "name": "BaseBdev1", 00:27:10.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:10.240 "is_configured": false, 00:27:10.240 "data_offset": 0, 00:27:10.240 "data_size": 0 00:27:10.240 }, 00:27:10.240 { 00:27:10.240 "name": "BaseBdev2", 00:27:10.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:10.240 "is_configured": false, 00:27:10.240 "data_offset": 0, 00:27:10.240 "data_size": 0 00:27:10.240 } 00:27:10.240 ] 00:27:10.240 }' 00:27:10.240 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:10.240 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:10.835 22:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:11.099 [2024-07-12 22:34:21.207591] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:11.099 [2024-07-12 22:34:21.207624] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2094a80 name Existed_Raid, state configuring 00:27:11.099 22:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:11.359 [2024-07-12 22:34:21.456271] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:11.359 [2024-07-12 22:34:21.456304] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:11.359 [2024-07-12 22:34:21.456314] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:11.359 [2024-07-12 22:34:21.456326] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:11.359 22:34:21 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:27:11.618 [2024-07-12 22:34:21.710786] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:11.618 BaseBdev1 00:27:11.618 22:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:11.618 22:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:27:11.618 22:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:11.618 22:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:27:11.618 22:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:11.618 22:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:11.618 22:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:11.879 22:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:12.138 [ 00:27:12.138 { 00:27:12.138 "name": "BaseBdev1", 00:27:12.138 "aliases": [ 00:27:12.138 "213628eb-48fd-4666-82a0-524dc7a5a72e" 00:27:12.138 ], 00:27:12.138 "product_name": "Malloc disk", 00:27:12.138 "block_size": 4096, 00:27:12.138 "num_blocks": 8192, 00:27:12.138 "uuid": "213628eb-48fd-4666-82a0-524dc7a5a72e", 00:27:12.138 "assigned_rate_limits": { 00:27:12.138 "rw_ios_per_sec": 0, 00:27:12.138 "rw_mbytes_per_sec": 0, 00:27:12.138 "r_mbytes_per_sec": 0, 00:27:12.138 "w_mbytes_per_sec": 0 00:27:12.138 }, 00:27:12.138 "claimed": true, 00:27:12.138 "claim_type": "exclusive_write", 00:27:12.138 "zoned": false, 00:27:12.138 "supported_io_types": { 00:27:12.138 "read": true, 00:27:12.138 "write": true, 00:27:12.138 "unmap": true, 00:27:12.138 "flush": true, 00:27:12.138 "reset": true, 00:27:12.138 "nvme_admin": false, 00:27:12.138 "nvme_io": false, 00:27:12.138 "nvme_io_md": false, 00:27:12.138 "write_zeroes": true, 00:27:12.138 "zcopy": true, 00:27:12.138 "get_zone_info": false, 00:27:12.138 "zone_management": false, 00:27:12.138 "zone_append": false, 00:27:12.138 "compare": false, 00:27:12.138 "compare_and_write": false, 00:27:12.138 "abort": true, 00:27:12.138 "seek_hole": false, 00:27:12.138 "seek_data": false, 00:27:12.138 "copy": true, 00:27:12.138 "nvme_iov_md": false 00:27:12.138 }, 00:27:12.138 "memory_domains": [ 00:27:12.138 { 00:27:12.138 "dma_device_id": "system", 00:27:12.138 "dma_device_type": 1 00:27:12.138 }, 00:27:12.138 { 00:27:12.138 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:12.138 "dma_device_type": 2 00:27:12.138 } 00:27:12.138 ], 00:27:12.138 "driver_specific": {} 00:27:12.138 } 00:27:12.138 ] 00:27:12.138 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:27:12.138 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:12.138 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:12.138 22:34:22 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:12.138 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:12.138 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:12.138 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:12.138 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:12.138 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:12.138 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:12.139 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:12.139 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.139 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:12.398 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:12.398 "name": "Existed_Raid", 00:27:12.398 "uuid": "3cff3ad5-dcc1-43f3-8e37-6e2e2d60998f", 00:27:12.398 "strip_size_kb": 0, 00:27:12.398 "state": "configuring", 00:27:12.398 "raid_level": "raid1", 00:27:12.398 "superblock": true, 00:27:12.398 "num_base_bdevs": 2, 00:27:12.398 "num_base_bdevs_discovered": 1, 00:27:12.398 "num_base_bdevs_operational": 2, 00:27:12.398 "base_bdevs_list": [ 00:27:12.398 { 00:27:12.398 "name": "BaseBdev1", 00:27:12.398 "uuid": "213628eb-48fd-4666-82a0-524dc7a5a72e", 00:27:12.398 "is_configured": true, 00:27:12.398 "data_offset": 256, 00:27:12.398 "data_size": 7936 00:27:12.398 }, 00:27:12.398 { 00:27:12.398 "name": "BaseBdev2", 00:27:12.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.398 "is_configured": false, 00:27:12.398 "data_offset": 0, 00:27:12.398 "data_size": 0 00:27:12.398 } 00:27:12.398 ] 00:27:12.398 }' 00:27:12.398 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:12.398 22:34:22 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:12.965 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:12.965 [2024-07-12 22:34:23.234827] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:12.965 [2024-07-12 22:34:23.234870] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2094350 name Existed_Raid, state configuring 00:27:12.965 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:13.224 [2024-07-12 22:34:23.483518] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:13.224 [2024-07-12 22:34:23.485005] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:13.224 [2024-07-12 22:34:23.485035] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.224 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:13.483 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:13.483 "name": "Existed_Raid", 00:27:13.483 "uuid": "0e6dff5c-1f79-414b-ac15-eea9b90c18dd", 00:27:13.483 "strip_size_kb": 0, 00:27:13.483 "state": "configuring", 00:27:13.483 "raid_level": "raid1", 00:27:13.483 "superblock": true, 00:27:13.483 "num_base_bdevs": 2, 00:27:13.483 "num_base_bdevs_discovered": 1, 00:27:13.483 "num_base_bdevs_operational": 2, 00:27:13.483 "base_bdevs_list": [ 00:27:13.483 { 00:27:13.483 "name": "BaseBdev1", 00:27:13.483 "uuid": "213628eb-48fd-4666-82a0-524dc7a5a72e", 00:27:13.483 "is_configured": true, 00:27:13.483 "data_offset": 256, 00:27:13.483 "data_size": 7936 00:27:13.483 }, 00:27:13.483 { 00:27:13.483 "name": "BaseBdev2", 00:27:13.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:13.483 "is_configured": false, 00:27:13.483 "data_offset": 0, 00:27:13.483 "data_size": 0 00:27:13.483 } 00:27:13.483 ] 00:27:13.483 }' 00:27:13.483 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:13.483 22:34:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:14.051 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:27:14.333 [2024-07-12 22:34:24.457463] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:14.333 [2024-07-12 22:34:24.457618] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2095000 00:27:14.333 [2024-07-12 22:34:24.457632] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:14.333 [2024-07-12 22:34:24.457806] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1faf0c0 00:27:14.333 [2024-07-12 22:34:24.457946] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2095000 00:27:14.333 [2024-07-12 22:34:24.457957] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2095000 00:27:14.333 [2024-07-12 22:34:24.458053] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:14.333 BaseBdev2 00:27:14.333 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:14.333 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:27:14.333 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:14.333 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:27:14.333 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:14.333 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:14.333 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:14.333 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:14.592 [ 00:27:14.592 { 00:27:14.592 "name": "BaseBdev2", 00:27:14.592 "aliases": [ 00:27:14.592 "7610aabf-c515-4aae-b69a-1157f2249251" 00:27:14.592 ], 00:27:14.592 "product_name": "Malloc disk", 00:27:14.592 "block_size": 4096, 00:27:14.592 "num_blocks": 8192, 00:27:14.592 "uuid": "7610aabf-c515-4aae-b69a-1157f2249251", 00:27:14.592 "assigned_rate_limits": { 00:27:14.592 "rw_ios_per_sec": 0, 00:27:14.592 "rw_mbytes_per_sec": 0, 00:27:14.592 "r_mbytes_per_sec": 0, 00:27:14.592 "w_mbytes_per_sec": 0 00:27:14.592 }, 00:27:14.592 "claimed": true, 00:27:14.592 "claim_type": "exclusive_write", 00:27:14.592 "zoned": false, 00:27:14.592 "supported_io_types": { 00:27:14.592 "read": true, 00:27:14.592 "write": true, 00:27:14.592 "unmap": true, 00:27:14.592 "flush": true, 00:27:14.592 "reset": true, 00:27:14.592 "nvme_admin": false, 00:27:14.592 "nvme_io": false, 00:27:14.592 "nvme_io_md": false, 00:27:14.592 "write_zeroes": true, 00:27:14.592 "zcopy": true, 00:27:14.592 "get_zone_info": false, 00:27:14.592 "zone_management": false, 00:27:14.592 "zone_append": false, 00:27:14.592 "compare": false, 00:27:14.592 "compare_and_write": false, 00:27:14.592 "abort": true, 00:27:14.592 "seek_hole": false, 00:27:14.592 "seek_data": false, 00:27:14.592 "copy": true, 00:27:14.592 "nvme_iov_md": false 00:27:14.592 }, 00:27:14.592 "memory_domains": [ 00:27:14.592 { 00:27:14.592 "dma_device_id": "system", 00:27:14.592 "dma_device_type": 1 00:27:14.592 }, 00:27:14.592 { 00:27:14.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:14.592 "dma_device_type": 2 00:27:14.592 } 00:27:14.592 ], 00:27:14.592 "driver_specific": {} 00:27:14.592 } 00:27:14.592 ] 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:27:14.592 22:34:24 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.592 22:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:14.851 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:14.851 "name": "Existed_Raid", 00:27:14.851 "uuid": "0e6dff5c-1f79-414b-ac15-eea9b90c18dd", 00:27:14.851 "strip_size_kb": 0, 00:27:14.851 "state": "online", 00:27:14.851 "raid_level": "raid1", 00:27:14.851 "superblock": true, 00:27:14.851 "num_base_bdevs": 2, 00:27:14.851 "num_base_bdevs_discovered": 2, 00:27:14.851 "num_base_bdevs_operational": 2, 00:27:14.851 "base_bdevs_list": [ 00:27:14.851 { 00:27:14.851 "name": "BaseBdev1", 00:27:14.851 "uuid": "213628eb-48fd-4666-82a0-524dc7a5a72e", 00:27:14.851 "is_configured": true, 00:27:14.851 "data_offset": 256, 00:27:14.851 "data_size": 7936 00:27:14.851 }, 00:27:14.851 { 00:27:14.851 "name": "BaseBdev2", 00:27:14.851 "uuid": "7610aabf-c515-4aae-b69a-1157f2249251", 00:27:14.851 "is_configured": true, 00:27:14.851 "data_offset": 256, 00:27:14.852 "data_size": 7936 00:27:14.852 } 00:27:14.852 ] 00:27:14.852 }' 00:27:14.852 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:14.852 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:15.419 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:15.419 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:15.419 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:15.419 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:15.419 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:27:15.419 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:15.419 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:15.419 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:15.677 [2024-07-12 22:34:25.881520] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:15.677 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:15.677 "name": "Existed_Raid", 00:27:15.677 "aliases": [ 00:27:15.677 "0e6dff5c-1f79-414b-ac15-eea9b90c18dd" 00:27:15.677 ], 00:27:15.677 "product_name": "Raid Volume", 00:27:15.677 "block_size": 4096, 00:27:15.677 "num_blocks": 7936, 00:27:15.677 "uuid": "0e6dff5c-1f79-414b-ac15-eea9b90c18dd", 00:27:15.677 "assigned_rate_limits": { 00:27:15.677 "rw_ios_per_sec": 0, 00:27:15.677 "rw_mbytes_per_sec": 0, 00:27:15.677 "r_mbytes_per_sec": 0, 00:27:15.677 "w_mbytes_per_sec": 0 00:27:15.677 }, 00:27:15.677 "claimed": false, 00:27:15.677 "zoned": false, 00:27:15.677 "supported_io_types": { 00:27:15.677 "read": true, 00:27:15.677 "write": true, 00:27:15.677 "unmap": false, 00:27:15.677 "flush": false, 00:27:15.677 "reset": true, 00:27:15.677 "nvme_admin": false, 00:27:15.677 "nvme_io": false, 00:27:15.677 "nvme_io_md": false, 00:27:15.677 "write_zeroes": true, 00:27:15.677 "zcopy": false, 00:27:15.677 "get_zone_info": false, 00:27:15.677 "zone_management": false, 00:27:15.677 "zone_append": false, 00:27:15.677 "compare": false, 00:27:15.677 "compare_and_write": false, 00:27:15.677 "abort": false, 00:27:15.677 "seek_hole": false, 00:27:15.677 "seek_data": false, 00:27:15.677 "copy": false, 00:27:15.677 "nvme_iov_md": false 00:27:15.677 }, 00:27:15.677 "memory_domains": [ 00:27:15.677 { 00:27:15.677 "dma_device_id": "system", 00:27:15.677 "dma_device_type": 1 00:27:15.677 }, 00:27:15.677 { 00:27:15.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:15.677 "dma_device_type": 2 00:27:15.677 }, 00:27:15.677 { 00:27:15.677 "dma_device_id": "system", 00:27:15.677 "dma_device_type": 1 00:27:15.677 }, 00:27:15.677 { 00:27:15.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:15.677 "dma_device_type": 2 00:27:15.677 } 00:27:15.677 ], 00:27:15.677 "driver_specific": { 00:27:15.677 "raid": { 00:27:15.677 "uuid": "0e6dff5c-1f79-414b-ac15-eea9b90c18dd", 00:27:15.677 "strip_size_kb": 0, 00:27:15.677 "state": "online", 00:27:15.677 "raid_level": "raid1", 00:27:15.677 "superblock": true, 00:27:15.677 "num_base_bdevs": 2, 00:27:15.677 "num_base_bdevs_discovered": 2, 00:27:15.677 "num_base_bdevs_operational": 2, 00:27:15.677 "base_bdevs_list": [ 00:27:15.677 { 00:27:15.677 "name": "BaseBdev1", 00:27:15.677 "uuid": "213628eb-48fd-4666-82a0-524dc7a5a72e", 00:27:15.677 "is_configured": true, 00:27:15.677 "data_offset": 256, 00:27:15.677 "data_size": 7936 00:27:15.677 }, 00:27:15.677 { 00:27:15.677 "name": "BaseBdev2", 00:27:15.677 "uuid": "7610aabf-c515-4aae-b69a-1157f2249251", 00:27:15.677 "is_configured": true, 00:27:15.677 "data_offset": 256, 00:27:15.677 "data_size": 7936 00:27:15.677 } 00:27:15.677 ] 00:27:15.677 } 00:27:15.677 } 00:27:15.677 }' 00:27:15.677 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:15.677 22:34:25 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:15.677 BaseBdev2' 00:27:15.677 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:15.677 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:15.677 22:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:15.936 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:15.936 "name": "BaseBdev1", 00:27:15.936 "aliases": [ 00:27:15.936 "213628eb-48fd-4666-82a0-524dc7a5a72e" 00:27:15.936 ], 00:27:15.936 "product_name": "Malloc disk", 00:27:15.936 "block_size": 4096, 00:27:15.936 "num_blocks": 8192, 00:27:15.936 "uuid": "213628eb-48fd-4666-82a0-524dc7a5a72e", 00:27:15.936 "assigned_rate_limits": { 00:27:15.936 "rw_ios_per_sec": 0, 00:27:15.936 "rw_mbytes_per_sec": 0, 00:27:15.936 "r_mbytes_per_sec": 0, 00:27:15.936 "w_mbytes_per_sec": 0 00:27:15.936 }, 00:27:15.936 "claimed": true, 00:27:15.936 "claim_type": "exclusive_write", 00:27:15.936 "zoned": false, 00:27:15.936 "supported_io_types": { 00:27:15.936 "read": true, 00:27:15.936 "write": true, 00:27:15.936 "unmap": true, 00:27:15.936 "flush": true, 00:27:15.936 "reset": true, 00:27:15.936 "nvme_admin": false, 00:27:15.936 "nvme_io": false, 00:27:15.936 "nvme_io_md": false, 00:27:15.936 "write_zeroes": true, 00:27:15.936 "zcopy": true, 00:27:15.936 "get_zone_info": false, 00:27:15.936 "zone_management": false, 00:27:15.936 "zone_append": false, 00:27:15.936 "compare": false, 00:27:15.936 "compare_and_write": false, 00:27:15.936 "abort": true, 00:27:15.936 "seek_hole": false, 00:27:15.936 "seek_data": false, 00:27:15.936 "copy": true, 00:27:15.936 "nvme_iov_md": false 00:27:15.936 }, 00:27:15.936 "memory_domains": [ 00:27:15.936 { 00:27:15.936 "dma_device_id": "system", 00:27:15.936 "dma_device_type": 1 00:27:15.936 }, 00:27:15.936 { 00:27:15.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:15.936 "dma_device_type": 2 00:27:15.936 } 00:27:15.936 ], 00:27:15.936 "driver_specific": {} 00:27:15.936 }' 00:27:15.936 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:15.936 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:16.196 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:16.196 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:16.196 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:16.196 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:16.196 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:16.196 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:16.196 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:16.196 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:16.196 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:16.454 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:16.454 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:16.454 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:16.454 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:16.713 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:16.713 "name": "BaseBdev2", 00:27:16.713 "aliases": [ 00:27:16.713 "7610aabf-c515-4aae-b69a-1157f2249251" 00:27:16.713 ], 00:27:16.713 "product_name": "Malloc disk", 00:27:16.713 "block_size": 4096, 00:27:16.713 "num_blocks": 8192, 00:27:16.713 "uuid": "7610aabf-c515-4aae-b69a-1157f2249251", 00:27:16.713 "assigned_rate_limits": { 00:27:16.713 "rw_ios_per_sec": 0, 00:27:16.713 "rw_mbytes_per_sec": 0, 00:27:16.713 "r_mbytes_per_sec": 0, 00:27:16.713 "w_mbytes_per_sec": 0 00:27:16.713 }, 00:27:16.713 "claimed": true, 00:27:16.713 "claim_type": "exclusive_write", 00:27:16.713 "zoned": false, 00:27:16.713 "supported_io_types": { 00:27:16.713 "read": true, 00:27:16.713 "write": true, 00:27:16.713 "unmap": true, 00:27:16.713 "flush": true, 00:27:16.713 "reset": true, 00:27:16.713 "nvme_admin": false, 00:27:16.713 "nvme_io": false, 00:27:16.713 "nvme_io_md": false, 00:27:16.713 "write_zeroes": true, 00:27:16.713 "zcopy": true, 00:27:16.713 "get_zone_info": false, 00:27:16.713 "zone_management": false, 00:27:16.713 "zone_append": false, 00:27:16.713 "compare": false, 00:27:16.713 "compare_and_write": false, 00:27:16.713 "abort": true, 00:27:16.713 "seek_hole": false, 00:27:16.713 "seek_data": false, 00:27:16.713 "copy": true, 00:27:16.713 "nvme_iov_md": false 00:27:16.713 }, 00:27:16.713 "memory_domains": [ 00:27:16.713 { 00:27:16.713 "dma_device_id": "system", 00:27:16.713 "dma_device_type": 1 00:27:16.713 }, 00:27:16.713 { 00:27:16.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:16.713 "dma_device_type": 2 00:27:16.713 } 00:27:16.713 ], 00:27:16.713 "driver_specific": {} 00:27:16.713 }' 00:27:16.713 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:16.713 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:16.713 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:16.713 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:16.713 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:16.713 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:16.713 22:34:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:16.713 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:16.974 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:16.974 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:16.974 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:16.974 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:16.974 22:34:27 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:17.233 [2024-07-12 22:34:27.369251] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:17.233 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:17.233 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:17.233 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:17.233 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:27:17.233 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:17.233 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:17.234 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:17.234 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:17.234 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:17.234 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:17.234 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:17.234 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:17.234 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:17.234 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:17.234 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:17.234 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:17.234 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:17.492 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:17.492 "name": "Existed_Raid", 00:27:17.492 "uuid": "0e6dff5c-1f79-414b-ac15-eea9b90c18dd", 00:27:17.492 "strip_size_kb": 0, 00:27:17.492 "state": "online", 00:27:17.492 "raid_level": "raid1", 00:27:17.492 "superblock": true, 00:27:17.492 "num_base_bdevs": 2, 00:27:17.492 "num_base_bdevs_discovered": 1, 00:27:17.492 "num_base_bdevs_operational": 1, 00:27:17.492 "base_bdevs_list": [ 00:27:17.492 { 00:27:17.492 "name": null, 00:27:17.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:17.492 "is_configured": false, 00:27:17.492 "data_offset": 256, 00:27:17.492 "data_size": 7936 00:27:17.492 }, 00:27:17.492 { 00:27:17.492 "name": "BaseBdev2", 00:27:17.492 "uuid": "7610aabf-c515-4aae-b69a-1157f2249251", 00:27:17.492 "is_configured": true, 00:27:17.492 "data_offset": 256, 00:27:17.492 "data_size": 7936 00:27:17.492 } 00:27:17.492 ] 00:27:17.492 }' 00:27:17.492 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:17.492 22:34:27 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@10 -- # set +x 00:27:18.059 22:34:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:18.059 22:34:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:18.059 22:34:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.059 22:34:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:18.318 22:34:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:18.318 22:34:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:18.318 22:34:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:18.576 [2024-07-12 22:34:28.734801] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:18.576 [2024-07-12 22:34:28.734892] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:18.576 [2024-07-12 22:34:28.745703] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:18.576 [2024-07-12 22:34:28.745737] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:18.576 [2024-07-12 22:34:28.745749] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2095000 name Existed_Raid, state offline 00:27:18.576 22:34:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:18.576 22:34:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:18.576 22:34:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.576 22:34:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 3559490 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 3559490 ']' 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 3559490 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3559490 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3559490' 00:27:18.835 killing process with pid 3559490 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 3559490 00:27:18.835 [2024-07-12 22:34:29.067199] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:18.835 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 3559490 00:27:18.835 [2024-07-12 22:34:29.068137] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:19.094 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:27:19.094 00:27:19.094 real 0m10.256s 00:27:19.094 user 0m18.148s 00:27:19.094 sys 0m1.977s 00:27:19.094 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:19.094 22:34:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:19.094 ************************************ 00:27:19.094 END TEST raid_state_function_test_sb_4k 00:27:19.094 ************************************ 00:27:19.094 22:34:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:19.094 22:34:29 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:27:19.094 22:34:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:19.094 22:34:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:19.094 22:34:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:19.094 ************************************ 00:27:19.094 START TEST raid_superblock_test_4k 00:27:19.094 ************************************ 00:27:19.094 22:34:29 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:27:19.094 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:19.094 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:19.095 
22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=3560950 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 3560950 /var/tmp/spdk-raid.sock 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 3560950 ']' 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:19.095 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:19.095 22:34:29 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:19.353 [2024-07-12 22:34:29.434665] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:27:19.353 [2024-07-12 22:34:29.434732] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3560950 ] 00:27:19.353 [2024-07-12 22:34:29.565394] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:19.353 [2024-07-12 22:34:29.671420] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:19.613 [2024-07-12 22:34:29.741718] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:19.613 [2024-07-12 22:34:29.741757] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:20.181 22:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:20.181 22:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:27:20.181 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:20.181 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:20.181 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:20.181 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:20.181 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:20.181 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:20.181 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:20.181 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:20.181 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:27:20.439 malloc1 00:27:20.440 22:34:30 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:20.699 [2024-07-12 22:34:30.872765] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:20.699 [2024-07-12 22:34:30.872811] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:20.699 [2024-07-12 22:34:30.872832] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x176a570 00:27:20.699 [2024-07-12 22:34:30.872845] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:20.699 [2024-07-12 22:34:30.874481] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:20.699 [2024-07-12 22:34:30.874510] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:20.699 pt1 00:27:20.699 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:20.699 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:20.699 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:27:20.699 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:20.699 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:20.699 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:20.699 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:20.699 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:20.699 22:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:27:20.958 malloc2 00:27:20.958 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:21.216 [2024-07-12 22:34:31.388174] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:21.216 [2024-07-12 22:34:31.388222] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:21.217 [2024-07-12 22:34:31.388240] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x176b970 00:27:21.217 [2024-07-12 22:34:31.388253] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:21.217 [2024-07-12 22:34:31.389893] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:21.217 [2024-07-12 22:34:31.389920] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:21.217 pt2 00:27:21.217 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:21.217 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:21.217 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 
00:27:21.475 [2024-07-12 22:34:31.632852] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:21.475 [2024-07-12 22:34:31.634230] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:21.475 [2024-07-12 22:34:31.634385] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x190e270 00:27:21.475 [2024-07-12 22:34:31.634398] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:21.475 [2024-07-12 22:34:31.634602] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17620e0 00:27:21.475 [2024-07-12 22:34:31.634751] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x190e270 00:27:21.475 [2024-07-12 22:34:31.634762] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x190e270 00:27:21.475 [2024-07-12 22:34:31.634865] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:21.475 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:21.475 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:21.475 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:21.475 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:21.475 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:21.475 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:21.475 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:21.475 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:21.476 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:21.476 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:21.476 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.476 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.784 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:21.784 "name": "raid_bdev1", 00:27:21.784 "uuid": "2b49cc45-65f7-4551-91f3-de71236c81ba", 00:27:21.784 "strip_size_kb": 0, 00:27:21.784 "state": "online", 00:27:21.784 "raid_level": "raid1", 00:27:21.784 "superblock": true, 00:27:21.784 "num_base_bdevs": 2, 00:27:21.784 "num_base_bdevs_discovered": 2, 00:27:21.784 "num_base_bdevs_operational": 2, 00:27:21.784 "base_bdevs_list": [ 00:27:21.784 { 00:27:21.784 "name": "pt1", 00:27:21.784 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:21.784 "is_configured": true, 00:27:21.784 "data_offset": 256, 00:27:21.784 "data_size": 7936 00:27:21.784 }, 00:27:21.784 { 00:27:21.784 "name": "pt2", 00:27:21.784 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:21.784 "is_configured": true, 00:27:21.784 "data_offset": 256, 00:27:21.784 "data_size": 7936 00:27:21.784 } 00:27:21.784 ] 00:27:21.784 }' 00:27:21.784 22:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:21.784 22:34:31 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:22.383 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:22.383 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:22.383 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:22.383 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:22.383 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:22.383 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:22.383 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:22.383 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:22.642 [2024-07-12 22:34:32.727974] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:22.642 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:22.642 "name": "raid_bdev1", 00:27:22.642 "aliases": [ 00:27:22.642 "2b49cc45-65f7-4551-91f3-de71236c81ba" 00:27:22.642 ], 00:27:22.642 "product_name": "Raid Volume", 00:27:22.642 "block_size": 4096, 00:27:22.642 "num_blocks": 7936, 00:27:22.642 "uuid": "2b49cc45-65f7-4551-91f3-de71236c81ba", 00:27:22.642 "assigned_rate_limits": { 00:27:22.642 "rw_ios_per_sec": 0, 00:27:22.642 "rw_mbytes_per_sec": 0, 00:27:22.642 "r_mbytes_per_sec": 0, 00:27:22.642 "w_mbytes_per_sec": 0 00:27:22.642 }, 00:27:22.642 "claimed": false, 00:27:22.642 "zoned": false, 00:27:22.642 "supported_io_types": { 00:27:22.642 "read": true, 00:27:22.642 "write": true, 00:27:22.642 "unmap": false, 00:27:22.642 "flush": false, 00:27:22.642 "reset": true, 00:27:22.642 "nvme_admin": false, 00:27:22.643 "nvme_io": false, 00:27:22.643 "nvme_io_md": false, 00:27:22.643 "write_zeroes": true, 00:27:22.643 "zcopy": false, 00:27:22.643 "get_zone_info": false, 00:27:22.643 "zone_management": false, 00:27:22.643 "zone_append": false, 00:27:22.643 "compare": false, 00:27:22.643 "compare_and_write": false, 00:27:22.643 "abort": false, 00:27:22.643 "seek_hole": false, 00:27:22.643 "seek_data": false, 00:27:22.643 "copy": false, 00:27:22.643 "nvme_iov_md": false 00:27:22.643 }, 00:27:22.643 "memory_domains": [ 00:27:22.643 { 00:27:22.643 "dma_device_id": "system", 00:27:22.643 "dma_device_type": 1 00:27:22.643 }, 00:27:22.643 { 00:27:22.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:22.643 "dma_device_type": 2 00:27:22.643 }, 00:27:22.643 { 00:27:22.643 "dma_device_id": "system", 00:27:22.643 "dma_device_type": 1 00:27:22.643 }, 00:27:22.643 { 00:27:22.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:22.643 "dma_device_type": 2 00:27:22.643 } 00:27:22.643 ], 00:27:22.643 "driver_specific": { 00:27:22.643 "raid": { 00:27:22.643 "uuid": "2b49cc45-65f7-4551-91f3-de71236c81ba", 00:27:22.643 "strip_size_kb": 0, 00:27:22.643 "state": "online", 00:27:22.643 "raid_level": "raid1", 00:27:22.643 "superblock": true, 00:27:22.643 "num_base_bdevs": 2, 00:27:22.643 "num_base_bdevs_discovered": 2, 00:27:22.643 "num_base_bdevs_operational": 2, 00:27:22.643 "base_bdevs_list": [ 00:27:22.643 { 00:27:22.643 "name": "pt1", 00:27:22.643 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:22.643 
"is_configured": true, 00:27:22.643 "data_offset": 256, 00:27:22.643 "data_size": 7936 00:27:22.643 }, 00:27:22.643 { 00:27:22.643 "name": "pt2", 00:27:22.643 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:22.643 "is_configured": true, 00:27:22.643 "data_offset": 256, 00:27:22.643 "data_size": 7936 00:27:22.643 } 00:27:22.643 ] 00:27:22.643 } 00:27:22.643 } 00:27:22.643 }' 00:27:22.643 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:22.643 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:22.643 pt2' 00:27:22.643 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:22.643 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:22.643 22:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:22.902 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:22.902 "name": "pt1", 00:27:22.902 "aliases": [ 00:27:22.902 "00000000-0000-0000-0000-000000000001" 00:27:22.902 ], 00:27:22.902 "product_name": "passthru", 00:27:22.902 "block_size": 4096, 00:27:22.902 "num_blocks": 8192, 00:27:22.902 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:22.902 "assigned_rate_limits": { 00:27:22.902 "rw_ios_per_sec": 0, 00:27:22.902 "rw_mbytes_per_sec": 0, 00:27:22.902 "r_mbytes_per_sec": 0, 00:27:22.902 "w_mbytes_per_sec": 0 00:27:22.902 }, 00:27:22.902 "claimed": true, 00:27:22.902 "claim_type": "exclusive_write", 00:27:22.902 "zoned": false, 00:27:22.902 "supported_io_types": { 00:27:22.902 "read": true, 00:27:22.902 "write": true, 00:27:22.902 "unmap": true, 00:27:22.902 "flush": true, 00:27:22.902 "reset": true, 00:27:22.902 "nvme_admin": false, 00:27:22.902 "nvme_io": false, 00:27:22.902 "nvme_io_md": false, 00:27:22.902 "write_zeroes": true, 00:27:22.902 "zcopy": true, 00:27:22.902 "get_zone_info": false, 00:27:22.902 "zone_management": false, 00:27:22.902 "zone_append": false, 00:27:22.902 "compare": false, 00:27:22.902 "compare_and_write": false, 00:27:22.902 "abort": true, 00:27:22.902 "seek_hole": false, 00:27:22.902 "seek_data": false, 00:27:22.902 "copy": true, 00:27:22.902 "nvme_iov_md": false 00:27:22.902 }, 00:27:22.902 "memory_domains": [ 00:27:22.902 { 00:27:22.902 "dma_device_id": "system", 00:27:22.902 "dma_device_type": 1 00:27:22.902 }, 00:27:22.902 { 00:27:22.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:22.902 "dma_device_type": 2 00:27:22.902 } 00:27:22.902 ], 00:27:22.902 "driver_specific": { 00:27:22.902 "passthru": { 00:27:22.902 "name": "pt1", 00:27:22.902 "base_bdev_name": "malloc1" 00:27:22.902 } 00:27:22.902 } 00:27:22.902 }' 00:27:22.902 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:22.902 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:22.902 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:22.902 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:22.902 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:22.902 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:22.902 22:34:33 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:23.161 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:23.161 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:23.161 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:23.161 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:23.161 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:23.161 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:23.161 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:23.161 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:23.421 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:23.421 "name": "pt2", 00:27:23.421 "aliases": [ 00:27:23.421 "00000000-0000-0000-0000-000000000002" 00:27:23.421 ], 00:27:23.421 "product_name": "passthru", 00:27:23.421 "block_size": 4096, 00:27:23.421 "num_blocks": 8192, 00:27:23.421 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:23.421 "assigned_rate_limits": { 00:27:23.421 "rw_ios_per_sec": 0, 00:27:23.421 "rw_mbytes_per_sec": 0, 00:27:23.421 "r_mbytes_per_sec": 0, 00:27:23.421 "w_mbytes_per_sec": 0 00:27:23.421 }, 00:27:23.421 "claimed": true, 00:27:23.421 "claim_type": "exclusive_write", 00:27:23.421 "zoned": false, 00:27:23.421 "supported_io_types": { 00:27:23.421 "read": true, 00:27:23.421 "write": true, 00:27:23.421 "unmap": true, 00:27:23.421 "flush": true, 00:27:23.421 "reset": true, 00:27:23.421 "nvme_admin": false, 00:27:23.421 "nvme_io": false, 00:27:23.421 "nvme_io_md": false, 00:27:23.421 "write_zeroes": true, 00:27:23.421 "zcopy": true, 00:27:23.421 "get_zone_info": false, 00:27:23.421 "zone_management": false, 00:27:23.421 "zone_append": false, 00:27:23.421 "compare": false, 00:27:23.421 "compare_and_write": false, 00:27:23.421 "abort": true, 00:27:23.421 "seek_hole": false, 00:27:23.421 "seek_data": false, 00:27:23.421 "copy": true, 00:27:23.421 "nvme_iov_md": false 00:27:23.421 }, 00:27:23.421 "memory_domains": [ 00:27:23.421 { 00:27:23.421 "dma_device_id": "system", 00:27:23.421 "dma_device_type": 1 00:27:23.421 }, 00:27:23.421 { 00:27:23.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:23.421 "dma_device_type": 2 00:27:23.421 } 00:27:23.421 ], 00:27:23.421 "driver_specific": { 00:27:23.421 "passthru": { 00:27:23.421 "name": "pt2", 00:27:23.421 "base_bdev_name": "malloc2" 00:27:23.421 } 00:27:23.421 } 00:27:23.421 }' 00:27:23.421 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:23.421 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:23.421 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:23.421 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:23.680 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:23.680 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:23.680 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:27:23.680 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:23.680 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:23.680 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:23.680 22:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:23.939 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:23.939 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:23.939 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:23.939 [2024-07-12 22:34:34.239963] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:23.939 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=2b49cc45-65f7-4551-91f3-de71236c81ba 00:27:23.939 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z 2b49cc45-65f7-4551-91f3-de71236c81ba ']' 00:27:23.939 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:24.198 [2024-07-12 22:34:34.484362] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:24.198 [2024-07-12 22:34:34.484386] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:24.198 [2024-07-12 22:34:34.484441] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:24.198 [2024-07-12 22:34:34.484498] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:24.198 [2024-07-12 22:34:34.484510] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x190e270 name raid_bdev1, state offline 00:27:24.198 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.198 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:24.456 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:24.456 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:24.456 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:24.456 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:24.715 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:24.715 22:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:24.975 22:34:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:24.975 22:34:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 
00:27:25.234 22:34:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:25.234 22:34:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:25.234 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:27:25.235 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:25.235 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:25.235 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:25.235 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:25.235 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:25.235 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:25.235 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:25.235 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:25.235 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:25.235 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:25.494 [2024-07-12 22:34:35.707533] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:25.494 [2024-07-12 22:34:35.708918] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:25.494 [2024-07-12 22:34:35.708984] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:25.494 [2024-07-12 22:34:35.709024] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:25.494 [2024-07-12 22:34:35.709043] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:25.494 [2024-07-12 22:34:35.709053] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x190dff0 name raid_bdev1, state configuring 00:27:25.494 request: 00:27:25.494 { 00:27:25.494 "name": "raid_bdev1", 00:27:25.494 "raid_level": "raid1", 00:27:25.494 "base_bdevs": [ 00:27:25.494 "malloc1", 00:27:25.494 "malloc2" 00:27:25.494 ], 00:27:25.494 "superblock": false, 00:27:25.494 "method": "bdev_raid_create", 00:27:25.494 "req_id": 1 00:27:25.494 } 00:27:25.494 Got JSON-RPC error response 00:27:25.494 response: 00:27:25.494 { 00:27:25.494 "code": -17, 00:27:25.494 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:25.494 } 00:27:25.494 22:34:35 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:27:25.494 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:25.494 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:25.494 22:34:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:25.494 22:34:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.494 22:34:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:27:25.753 22:34:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:27:25.753 22:34:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:27:25.753 22:34:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:26.011 [2024-07-12 22:34:36.208803] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:26.011 [2024-07-12 22:34:36.208859] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:26.011 [2024-07-12 22:34:36.208880] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x176a7a0 00:27:26.011 [2024-07-12 22:34:36.208899] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:26.011 [2024-07-12 22:34:36.210486] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:26.011 [2024-07-12 22:34:36.210514] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:26.011 [2024-07-12 22:34:36.210580] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:26.012 [2024-07-12 22:34:36.210605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:26.012 pt1 00:27:26.012 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:26.012 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:26.012 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:26.012 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:26.012 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:26.012 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:26.012 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:26.012 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:26.012 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:26.012 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:26.012 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.012 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.270 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.270 "name": "raid_bdev1", 00:27:26.270 "uuid": "2b49cc45-65f7-4551-91f3-de71236c81ba", 00:27:26.270 "strip_size_kb": 0, 00:27:26.270 "state": "configuring", 00:27:26.270 "raid_level": "raid1", 00:27:26.270 "superblock": true, 00:27:26.270 "num_base_bdevs": 2, 00:27:26.270 "num_base_bdevs_discovered": 1, 00:27:26.270 "num_base_bdevs_operational": 2, 00:27:26.270 "base_bdevs_list": [ 00:27:26.270 { 00:27:26.270 "name": "pt1", 00:27:26.270 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:26.270 "is_configured": true, 00:27:26.271 "data_offset": 256, 00:27:26.271 "data_size": 7936 00:27:26.271 }, 00:27:26.271 { 00:27:26.271 "name": null, 00:27:26.271 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:26.271 "is_configured": false, 00:27:26.271 "data_offset": 256, 00:27:26.271 "data_size": 7936 00:27:26.271 } 00:27:26.271 ] 00:27:26.271 }' 00:27:26.271 22:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.271 22:34:36 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:26.838 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:27:26.838 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:27:26.838 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:26.838 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:27.097 [2024-07-12 22:34:37.311721] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:27.097 [2024-07-12 22:34:37.311773] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:27.097 [2024-07-12 22:34:37.311792] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19026f0 00:27:27.097 [2024-07-12 22:34:37.311805] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:27.097 [2024-07-12 22:34:37.312204] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:27.097 [2024-07-12 22:34:37.312224] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:27.097 [2024-07-12 22:34:37.312287] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:27.097 [2024-07-12 22:34:37.312307] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:27.097 [2024-07-12 22:34:37.312409] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1903590 00:27:27.097 [2024-07-12 22:34:37.312420] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:27.097 [2024-07-12 22:34:37.312587] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1764540 00:27:27.097 [2024-07-12 22:34:37.312713] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1903590 00:27:27.097 [2024-07-12 22:34:37.312723] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1903590 00:27:27.097 [2024-07-12 22:34:37.312818] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:27.097 pt2 00:27:27.097 22:34:37 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.097 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:27.356 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:27.356 "name": "raid_bdev1", 00:27:27.356 "uuid": "2b49cc45-65f7-4551-91f3-de71236c81ba", 00:27:27.356 "strip_size_kb": 0, 00:27:27.356 "state": "online", 00:27:27.356 "raid_level": "raid1", 00:27:27.356 "superblock": true, 00:27:27.356 "num_base_bdevs": 2, 00:27:27.356 "num_base_bdevs_discovered": 2, 00:27:27.356 "num_base_bdevs_operational": 2, 00:27:27.356 "base_bdevs_list": [ 00:27:27.356 { 00:27:27.356 "name": "pt1", 00:27:27.356 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:27.356 "is_configured": true, 00:27:27.356 "data_offset": 256, 00:27:27.356 "data_size": 7936 00:27:27.356 }, 00:27:27.356 { 00:27:27.356 "name": "pt2", 00:27:27.356 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:27.356 "is_configured": true, 00:27:27.356 "data_offset": 256, 00:27:27.356 "data_size": 7936 00:27:27.356 } 00:27:27.356 ] 00:27:27.356 }' 00:27:27.356 22:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:27.356 22:34:37 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:27.923 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:27:27.923 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:27.923 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:27.923 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:27.923 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:27.923 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:27.923 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:27.923 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:28.181 [2024-07-12 22:34:38.390848] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:28.181 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:28.181 "name": "raid_bdev1", 00:27:28.181 "aliases": [ 00:27:28.181 "2b49cc45-65f7-4551-91f3-de71236c81ba" 00:27:28.181 ], 00:27:28.181 "product_name": "Raid Volume", 00:27:28.181 "block_size": 4096, 00:27:28.181 "num_blocks": 7936, 00:27:28.181 "uuid": "2b49cc45-65f7-4551-91f3-de71236c81ba", 00:27:28.181 "assigned_rate_limits": { 00:27:28.181 "rw_ios_per_sec": 0, 00:27:28.181 "rw_mbytes_per_sec": 0, 00:27:28.182 "r_mbytes_per_sec": 0, 00:27:28.182 "w_mbytes_per_sec": 0 00:27:28.182 }, 00:27:28.182 "claimed": false, 00:27:28.182 "zoned": false, 00:27:28.182 "supported_io_types": { 00:27:28.182 "read": true, 00:27:28.182 "write": true, 00:27:28.182 "unmap": false, 00:27:28.182 "flush": false, 00:27:28.182 "reset": true, 00:27:28.182 "nvme_admin": false, 00:27:28.182 "nvme_io": false, 00:27:28.182 "nvme_io_md": false, 00:27:28.182 "write_zeroes": true, 00:27:28.182 "zcopy": false, 00:27:28.182 "get_zone_info": false, 00:27:28.182 "zone_management": false, 00:27:28.182 "zone_append": false, 00:27:28.182 "compare": false, 00:27:28.182 "compare_and_write": false, 00:27:28.182 "abort": false, 00:27:28.182 "seek_hole": false, 00:27:28.182 "seek_data": false, 00:27:28.182 "copy": false, 00:27:28.182 "nvme_iov_md": false 00:27:28.182 }, 00:27:28.182 "memory_domains": [ 00:27:28.182 { 00:27:28.182 "dma_device_id": "system", 00:27:28.182 "dma_device_type": 1 00:27:28.182 }, 00:27:28.182 { 00:27:28.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:28.182 "dma_device_type": 2 00:27:28.182 }, 00:27:28.182 { 00:27:28.182 "dma_device_id": "system", 00:27:28.182 "dma_device_type": 1 00:27:28.182 }, 00:27:28.182 { 00:27:28.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:28.182 "dma_device_type": 2 00:27:28.182 } 00:27:28.182 ], 00:27:28.182 "driver_specific": { 00:27:28.182 "raid": { 00:27:28.182 "uuid": "2b49cc45-65f7-4551-91f3-de71236c81ba", 00:27:28.182 "strip_size_kb": 0, 00:27:28.182 "state": "online", 00:27:28.182 "raid_level": "raid1", 00:27:28.182 "superblock": true, 00:27:28.182 "num_base_bdevs": 2, 00:27:28.182 "num_base_bdevs_discovered": 2, 00:27:28.182 "num_base_bdevs_operational": 2, 00:27:28.182 "base_bdevs_list": [ 00:27:28.182 { 00:27:28.182 "name": "pt1", 00:27:28.182 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:28.182 "is_configured": true, 00:27:28.182 "data_offset": 256, 00:27:28.182 "data_size": 7936 00:27:28.182 }, 00:27:28.182 { 00:27:28.182 "name": "pt2", 00:27:28.182 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:28.182 "is_configured": true, 00:27:28.182 "data_offset": 256, 00:27:28.182 "data_size": 7936 00:27:28.182 } 00:27:28.182 ] 00:27:28.182 } 00:27:28.182 } 00:27:28.182 }' 00:27:28.182 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:28.182 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:28.182 pt2' 00:27:28.182 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:28.182 22:34:38 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:28.182 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:28.447 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:28.447 "name": "pt1", 00:27:28.447 "aliases": [ 00:27:28.447 "00000000-0000-0000-0000-000000000001" 00:27:28.447 ], 00:27:28.447 "product_name": "passthru", 00:27:28.447 "block_size": 4096, 00:27:28.447 "num_blocks": 8192, 00:27:28.447 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:28.447 "assigned_rate_limits": { 00:27:28.447 "rw_ios_per_sec": 0, 00:27:28.447 "rw_mbytes_per_sec": 0, 00:27:28.447 "r_mbytes_per_sec": 0, 00:27:28.447 "w_mbytes_per_sec": 0 00:27:28.447 }, 00:27:28.447 "claimed": true, 00:27:28.447 "claim_type": "exclusive_write", 00:27:28.447 "zoned": false, 00:27:28.447 "supported_io_types": { 00:27:28.447 "read": true, 00:27:28.447 "write": true, 00:27:28.447 "unmap": true, 00:27:28.447 "flush": true, 00:27:28.447 "reset": true, 00:27:28.447 "nvme_admin": false, 00:27:28.447 "nvme_io": false, 00:27:28.447 "nvme_io_md": false, 00:27:28.447 "write_zeroes": true, 00:27:28.447 "zcopy": true, 00:27:28.447 "get_zone_info": false, 00:27:28.447 "zone_management": false, 00:27:28.447 "zone_append": false, 00:27:28.447 "compare": false, 00:27:28.447 "compare_and_write": false, 00:27:28.447 "abort": true, 00:27:28.447 "seek_hole": false, 00:27:28.447 "seek_data": false, 00:27:28.447 "copy": true, 00:27:28.447 "nvme_iov_md": false 00:27:28.447 }, 00:27:28.447 "memory_domains": [ 00:27:28.447 { 00:27:28.447 "dma_device_id": "system", 00:27:28.447 "dma_device_type": 1 00:27:28.447 }, 00:27:28.447 { 00:27:28.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:28.447 "dma_device_type": 2 00:27:28.447 } 00:27:28.447 ], 00:27:28.447 "driver_specific": { 00:27:28.447 "passthru": { 00:27:28.447 "name": "pt1", 00:27:28.447 "base_bdev_name": "malloc1" 00:27:28.447 } 00:27:28.447 } 00:27:28.447 }' 00:27:28.447 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:28.447 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:28.708 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:28.708 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:28.708 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:28.708 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:28.708 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:28.708 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:28.708 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:28.708 22:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:28.708 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:28.967 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:28.967 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:28.967 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:28.967 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:29.226 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:29.226 "name": "pt2", 00:27:29.226 "aliases": [ 00:27:29.226 "00000000-0000-0000-0000-000000000002" 00:27:29.226 ], 00:27:29.226 "product_name": "passthru", 00:27:29.226 "block_size": 4096, 00:27:29.226 "num_blocks": 8192, 00:27:29.226 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:29.226 "assigned_rate_limits": { 00:27:29.226 "rw_ios_per_sec": 0, 00:27:29.226 "rw_mbytes_per_sec": 0, 00:27:29.226 "r_mbytes_per_sec": 0, 00:27:29.226 "w_mbytes_per_sec": 0 00:27:29.226 }, 00:27:29.226 "claimed": true, 00:27:29.226 "claim_type": "exclusive_write", 00:27:29.226 "zoned": false, 00:27:29.226 "supported_io_types": { 00:27:29.226 "read": true, 00:27:29.226 "write": true, 00:27:29.226 "unmap": true, 00:27:29.226 "flush": true, 00:27:29.226 "reset": true, 00:27:29.226 "nvme_admin": false, 00:27:29.226 "nvme_io": false, 00:27:29.226 "nvme_io_md": false, 00:27:29.226 "write_zeroes": true, 00:27:29.226 "zcopy": true, 00:27:29.226 "get_zone_info": false, 00:27:29.226 "zone_management": false, 00:27:29.226 "zone_append": false, 00:27:29.226 "compare": false, 00:27:29.226 "compare_and_write": false, 00:27:29.226 "abort": true, 00:27:29.226 "seek_hole": false, 00:27:29.226 "seek_data": false, 00:27:29.226 "copy": true, 00:27:29.226 "nvme_iov_md": false 00:27:29.226 }, 00:27:29.226 "memory_domains": [ 00:27:29.226 { 00:27:29.226 "dma_device_id": "system", 00:27:29.226 "dma_device_type": 1 00:27:29.226 }, 00:27:29.226 { 00:27:29.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:29.226 "dma_device_type": 2 00:27:29.226 } 00:27:29.226 ], 00:27:29.226 "driver_specific": { 00:27:29.226 "passthru": { 00:27:29.226 "name": "pt2", 00:27:29.226 "base_bdev_name": "malloc2" 00:27:29.226 } 00:27:29.226 } 00:27:29.226 }' 00:27:29.226 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:29.226 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:29.226 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:29.226 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:29.226 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:29.226 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:29.226 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:29.226 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:29.226 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:29.226 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:29.485 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:29.485 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:29.485 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:27:29.485 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b raid_bdev1 00:27:29.744 [2024-07-12 22:34:39.814661] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:29.744 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' 2b49cc45-65f7-4551-91f3-de71236c81ba '!=' 2b49cc45-65f7-4551-91f3-de71236c81ba ']' 00:27:29.744 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:27:29.744 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:29.744 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:27:29.744 22:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:29.744 [2024-07-12 22:34:40.063093] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:30.003 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:30.003 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:30.003 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:30.003 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:30.003 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:30.003 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:30.003 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:30.003 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:30.003 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:30.003 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:30.003 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.003 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:30.263 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:30.263 "name": "raid_bdev1", 00:27:30.263 "uuid": "2b49cc45-65f7-4551-91f3-de71236c81ba", 00:27:30.263 "strip_size_kb": 0, 00:27:30.263 "state": "online", 00:27:30.263 "raid_level": "raid1", 00:27:30.263 "superblock": true, 00:27:30.263 "num_base_bdevs": 2, 00:27:30.263 "num_base_bdevs_discovered": 1, 00:27:30.263 "num_base_bdevs_operational": 1, 00:27:30.263 "base_bdevs_list": [ 00:27:30.263 { 00:27:30.263 "name": null, 00:27:30.263 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:30.263 "is_configured": false, 00:27:30.263 "data_offset": 256, 00:27:30.263 "data_size": 7936 00:27:30.263 }, 00:27:30.263 { 00:27:30.263 "name": "pt2", 00:27:30.263 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:30.263 "is_configured": true, 00:27:30.263 "data_offset": 256, 00:27:30.263 "data_size": 7936 00:27:30.263 } 00:27:30.263 ] 00:27:30.263 }' 00:27:30.263 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:30.263 22:34:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 
00:27:30.832 22:34:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:30.832 [2024-07-12 22:34:41.089774] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:30.832 [2024-07-12 22:34:41.089800] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:30.832 [2024-07-12 22:34:41.089854] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:30.832 [2024-07-12 22:34:41.089895] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:30.832 [2024-07-12 22:34:41.089907] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1903590 name raid_bdev1, state offline 00:27:30.832 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:27:30.832 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.091 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:27:31.091 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:27:31.091 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:27:31.091 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:31.091 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:31.351 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:31.351 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:31.351 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:27:31.351 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:31.351 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:27:31.351 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:31.610 [2024-07-12 22:34:41.763541] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:31.610 [2024-07-12 22:34:41.763592] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:31.610 [2024-07-12 22:34:41.763610] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x176b160 00:27:31.610 [2024-07-12 22:34:41.763622] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:31.610 [2024-07-12 22:34:41.765238] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:31.610 [2024-07-12 22:34:41.765266] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:31.610 [2024-07-12 22:34:41.765333] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:31.610 [2024-07-12 22:34:41.765359] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:31.610 [2024-07-12 22:34:41.765443] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1761380 00:27:31.610 [2024-07-12 22:34:41.765453] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:31.610 [2024-07-12 22:34:41.765625] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1762a80 00:27:31.610 [2024-07-12 22:34:41.765746] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1761380 00:27:31.610 [2024-07-12 22:34:41.765756] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1761380 00:27:31.610 [2024-07-12 22:34:41.765852] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:31.610 pt2 00:27:31.610 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:31.610 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:31.610 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:31.610 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:31.610 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:31.610 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:31.610 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:31.610 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:31.610 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:31.610 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:31.610 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.610 22:34:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.870 22:34:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:31.870 "name": "raid_bdev1", 00:27:31.870 "uuid": "2b49cc45-65f7-4551-91f3-de71236c81ba", 00:27:31.870 "strip_size_kb": 0, 00:27:31.870 "state": "online", 00:27:31.870 "raid_level": "raid1", 00:27:31.870 "superblock": true, 00:27:31.870 "num_base_bdevs": 2, 00:27:31.870 "num_base_bdevs_discovered": 1, 00:27:31.870 "num_base_bdevs_operational": 1, 00:27:31.870 "base_bdevs_list": [ 00:27:31.870 { 00:27:31.870 "name": null, 00:27:31.870 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.870 "is_configured": false, 00:27:31.870 "data_offset": 256, 00:27:31.870 "data_size": 7936 00:27:31.870 }, 00:27:31.870 { 00:27:31.870 "name": "pt2", 00:27:31.870 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:31.870 "is_configured": true, 00:27:31.870 "data_offset": 256, 00:27:31.870 "data_size": 7936 00:27:31.870 } 00:27:31.870 ] 00:27:31.870 }' 00:27:31.870 22:34:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:31.870 22:34:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:32.438 22:34:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
raid_bdev1 00:27:32.713 [2024-07-12 22:34:42.802278] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:32.713 [2024-07-12 22:34:42.802305] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:32.713 [2024-07-12 22:34:42.802358] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:32.713 [2024-07-12 22:34:42.802403] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:32.713 [2024-07-12 22:34:42.802415] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1761380 name raid_bdev1, state offline 00:27:32.713 22:34:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.713 22:34:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:27:32.713 22:34:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:27:32.713 22:34:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:27:32.713 22:34:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:27:32.713 22:34:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:32.976 [2024-07-12 22:34:43.151178] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:32.976 [2024-07-12 22:34:43.151225] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:32.976 [2024-07-12 22:34:43.151243] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x190d520 00:27:32.976 [2024-07-12 22:34:43.151256] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:32.976 [2024-07-12 22:34:43.152841] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:32.976 [2024-07-12 22:34:43.152868] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:32.976 [2024-07-12 22:34:43.152939] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:32.976 [2024-07-12 22:34:43.152965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:32.976 [2024-07-12 22:34:43.153062] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:32.976 [2024-07-12 22:34:43.153075] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:32.976 [2024-07-12 22:34:43.153088] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17623f0 name raid_bdev1, state configuring 00:27:32.977 [2024-07-12 22:34:43.153110] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:32.977 [2024-07-12 22:34:43.153166] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17642b0 00:27:32.977 [2024-07-12 22:34:43.153177] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:32.977 [2024-07-12 22:34:43.153337] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1761350 00:27:32.977 [2024-07-12 22:34:43.153465] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17642b0 
00:27:32.977 [2024-07-12 22:34:43.153475] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17642b0 00:27:32.977 [2024-07-12 22:34:43.153572] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:32.977 pt1 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.977 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:33.236 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:33.236 "name": "raid_bdev1", 00:27:33.236 "uuid": "2b49cc45-65f7-4551-91f3-de71236c81ba", 00:27:33.236 "strip_size_kb": 0, 00:27:33.236 "state": "online", 00:27:33.236 "raid_level": "raid1", 00:27:33.236 "superblock": true, 00:27:33.236 "num_base_bdevs": 2, 00:27:33.236 "num_base_bdevs_discovered": 1, 00:27:33.236 "num_base_bdevs_operational": 1, 00:27:33.236 "base_bdevs_list": [ 00:27:33.236 { 00:27:33.236 "name": null, 00:27:33.236 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:33.236 "is_configured": false, 00:27:33.236 "data_offset": 256, 00:27:33.236 "data_size": 7936 00:27:33.236 }, 00:27:33.236 { 00:27:33.236 "name": "pt2", 00:27:33.236 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:33.236 "is_configured": true, 00:27:33.236 "data_offset": 256, 00:27:33.236 "data_size": 7936 00:27:33.236 } 00:27:33.236 ] 00:27:33.236 }' 00:27:33.236 22:34:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:33.236 22:34:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:33.804 22:34:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:33.804 22:34:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:34.062 22:34:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:27:34.062 22:34:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:34.062 22:34:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:27:34.321 [2024-07-12 22:34:44.502983] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:34.321 22:34:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 2b49cc45-65f7-4551-91f3-de71236c81ba '!=' 2b49cc45-65f7-4551-91f3-de71236c81ba ']' 00:27:34.321 22:34:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 3560950 00:27:34.321 22:34:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 3560950 ']' 00:27:34.321 22:34:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 3560950 00:27:34.321 22:34:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:27:34.321 22:34:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:34.321 22:34:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3560950 00:27:34.321 22:34:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:34.321 22:34:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:34.321 22:34:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3560950' 00:27:34.321 killing process with pid 3560950 00:27:34.321 22:34:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 3560950 00:27:34.321 [2024-07-12 22:34:44.571632] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:34.321 [2024-07-12 22:34:44.571687] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:34.321 [2024-07-12 22:34:44.571731] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:34.321 [2024-07-12 22:34:44.571744] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17642b0 name raid_bdev1, state offline 00:27:34.321 22:34:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 3560950 00:27:34.321 [2024-07-12 22:34:44.590935] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:34.580 22:34:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:27:34.580 00:27:34.580 real 0m15.445s 00:27:34.580 user 0m27.974s 00:27:34.580 sys 0m2.873s 00:27:34.580 22:34:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:34.580 22:34:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:34.580 ************************************ 00:27:34.580 END TEST raid_superblock_test_4k 00:27:34.580 ************************************ 00:27:34.580 22:34:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:34.580 22:34:44 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:27:34.580 22:34:44 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:27:34.580 22:34:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:34.580 22:34:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:34.580 22:34:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:34.580 
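For readers following the trace, the superblock check that just completed boils down to a short RPC sequence against the dedicated raid socket. The sketch below condenses the commands visible above (paths, socket, bdev names and the UUID value are copied from the log; the variable names and the explicit exit are illustrative, not part of the test script):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Re-register the first wrapped base bdev; the examine path in the trace then
    # reports "raid superblock found on bdev pt1" and reassembles raid_bdev1.
    $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001

    # The reassembled array must report the same UUID it had before the teardown.
    old_uuid="2b49cc45-65f7-4551-91f3-de71236c81ba"   # value captured earlier in the trace
    new_uuid=$($rpc bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid')
    if [ "$old_uuid" != "$new_uuid" ]; then
        echo "superblock UUID mismatch" >&2
        exit 1
    fi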
************************************ 00:27:34.580 START TEST raid_rebuild_test_sb_4k 00:27:34.580 ************************************ 00:27:34.580 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:27:34.580 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:34.580 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:34.580 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:34.580 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=3563365 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 3563365 /var/tmp/spdk-raid.sock 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 3563365 ']' 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 
00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:34.839 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:34.839 22:34:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:34.839 [2024-07-12 22:34:44.962967] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:27:34.839 [2024-07-12 22:34:44.963035] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3563365 ] 00:27:34.839 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:34.839 Zero copy mechanism will not be used. 00:27:34.839 [2024-07-12 22:34:45.089445] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:35.098 [2024-07-12 22:34:45.197115] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:35.098 [2024-07-12 22:34:45.264744] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:35.098 [2024-07-12 22:34:45.264783] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:35.665 22:34:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:35.666 22:34:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:27:35.666 22:34:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:35.666 22:34:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:27:35.924 BaseBdev1_malloc 00:27:35.924 22:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:36.183 [2024-07-12 22:34:46.383052] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:36.183 [2024-07-12 22:34:46.383101] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:36.183 [2024-07-12 22:34:46.383130] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf7cd40 00:27:36.183 [2024-07-12 22:34:46.383143] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:36.183 [2024-07-12 22:34:46.384923] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:36.183 [2024-07-12 22:34:46.384957] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:36.183 BaseBdev1 00:27:36.183 22:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:36.183 22:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:27:36.442 BaseBdev2_malloc 
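Each base device in the rebuild test above follows the same two-step construction: a malloc bdev (the 32 / 4096 arguments select the backing-store size in MB and a 4096-byte block size) wrapped in a passthru bdev that the RAID later claims. A condensed sketch, with names and paths taken from the trace and the loop added purely for illustration:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for base in BaseBdev1 BaseBdev2; do
        $rpc bdev_malloc_create 32 4096 -b "${base}_malloc"        # raw backing bdev
        $rpc bdev_passthru_create -b "${base}_malloc" -p "$base"   # wrapper the RAID claims
    done

The spare device that appears next in the trace gets an additional delay bdev (bdev_delay_create) between its malloc backing store and its passthru wrapper.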
00:27:36.442 22:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:36.701 [2024-07-12 22:34:46.886475] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:36.701 [2024-07-12 22:34:46.886522] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:36.701 [2024-07-12 22:34:46.886550] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf7d860 00:27:36.701 [2024-07-12 22:34:46.886563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:36.701 [2024-07-12 22:34:46.888148] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:36.701 [2024-07-12 22:34:46.888176] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:36.701 BaseBdev2 00:27:36.701 22:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:27:36.960 spare_malloc 00:27:36.960 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:37.219 spare_delay 00:27:37.219 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:37.478 [2024-07-12 22:34:47.617288] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:37.478 [2024-07-12 22:34:47.617335] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:37.478 [2024-07-12 22:34:47.617359] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x112bec0 00:27:37.478 [2024-07-12 22:34:47.617371] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:37.478 [2024-07-12 22:34:47.618960] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:37.478 [2024-07-12 22:34:47.618987] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:37.478 spare 00:27:37.478 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:37.737 [2024-07-12 22:34:47.861968] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:37.737 [2024-07-12 22:34:47.863300] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:37.737 [2024-07-12 22:34:47.863468] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x112d070 00:27:37.737 [2024-07-12 22:34:47.863481] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:37.737 [2024-07-12 22:34:47.863673] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1126490 00:27:37.737 [2024-07-12 22:34:47.863817] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x112d070 00:27:37.737 [2024-07-12 22:34:47.863827] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: 
raid bdev is created with name raid_bdev1, raid_bdev 0x112d070 00:27:37.737 [2024-07-12 22:34:47.863938] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:37.737 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:37.737 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:37.737 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:37.737 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:37.737 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:37.737 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:37.737 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:37.737 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:37.737 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:37.737 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:37.737 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.737 22:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:37.996 22:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:37.996 "name": "raid_bdev1", 00:27:37.996 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:37.996 "strip_size_kb": 0, 00:27:37.996 "state": "online", 00:27:37.996 "raid_level": "raid1", 00:27:37.996 "superblock": true, 00:27:37.996 "num_base_bdevs": 2, 00:27:37.996 "num_base_bdevs_discovered": 2, 00:27:37.996 "num_base_bdevs_operational": 2, 00:27:37.996 "base_bdevs_list": [ 00:27:37.996 { 00:27:37.996 "name": "BaseBdev1", 00:27:37.996 "uuid": "09b4fc0e-864f-5a2f-83c7-ac7f9b66582b", 00:27:37.996 "is_configured": true, 00:27:37.996 "data_offset": 256, 00:27:37.996 "data_size": 7936 00:27:37.996 }, 00:27:37.996 { 00:27:37.996 "name": "BaseBdev2", 00:27:37.996 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:37.996 "is_configured": true, 00:27:37.996 "data_offset": 256, 00:27:37.996 "data_size": 7936 00:27:37.996 } 00:27:37.996 ] 00:27:37.996 }' 00:27:37.996 22:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:37.996 22:34:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:38.564 22:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:38.564 22:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:38.564 [2024-07-12 22:34:48.876933] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:38.823 22:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:27:38.823 22:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:27:38.823 22:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:38.823 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:39.081 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:39.081 [2024-07-12 22:34:49.378065] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1126490 00:27:39.081 /dev/nbd0 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:39.340 1+0 records in 00:27:39.340 1+0 records out 00:27:39.340 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229415 s, 17.9 MB/s 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:27:39.340 22:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:27:39.908 7936+0 records in 00:27:39.908 7936+0 records out 00:27:39.908 32505856 bytes (33 MB, 31 MiB) copied, 0.748772 s, 43.4 MB/s 00:27:39.908 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:39.908 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:39.908 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:39.908 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:39.908 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:39.908 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:39.908 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:40.166 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:40.166 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:40.166 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:40.166 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:40.166 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:40.166 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:40.166 [2024-07-12 22:34:50.404157] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:40.166 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:40.166 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:40.166 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:40.425 [2024-07-12 22:34:50.632802] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:40.425 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:40.425 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:40.425 22:34:50 
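Before any failure is injected, the trace above populates the whole array over NBD: the 7936 addressable blocks at a 4096-byte block size account for the 32505856 bytes reported by dd. Condensed from the commands in the log (device path and counts copied verbatim; the rpc shorthand is illustrative):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $rpc nbd_start_disk raid_bdev1 /dev/nbd0                            # export the RAID1 volume
    dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct     # fill every data block
    $rpc nbd_stop_disk /dev/nbd0                                        # detach before the rebuild test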
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:40.425 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:40.425 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:40.425 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:40.425 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:40.425 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:40.425 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:40.425 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:40.425 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.425 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:40.704 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:40.704 "name": "raid_bdev1", 00:27:40.704 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:40.704 "strip_size_kb": 0, 00:27:40.704 "state": "online", 00:27:40.704 "raid_level": "raid1", 00:27:40.704 "superblock": true, 00:27:40.704 "num_base_bdevs": 2, 00:27:40.704 "num_base_bdevs_discovered": 1, 00:27:40.704 "num_base_bdevs_operational": 1, 00:27:40.704 "base_bdevs_list": [ 00:27:40.704 { 00:27:40.704 "name": null, 00:27:40.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:40.704 "is_configured": false, 00:27:40.704 "data_offset": 256, 00:27:40.704 "data_size": 7936 00:27:40.704 }, 00:27:40.704 { 00:27:40.704 "name": "BaseBdev2", 00:27:40.704 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:40.704 "is_configured": true, 00:27:40.704 "data_offset": 256, 00:27:40.704 "data_size": 7936 00:27:40.704 } 00:27:40.704 ] 00:27:40.704 }' 00:27:40.704 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:40.704 22:34:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:41.320 22:34:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:41.320 [2024-07-12 22:34:51.635488] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:41.320 [2024-07-12 22:34:51.641159] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x112cce0 00:27:41.320 [2024-07-12 22:34:51.643421] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:41.580 22:34:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:42.517 22:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:42.517 22:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:42.517 22:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:42.517 22:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:42.517 22:34:52 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:42.517 22:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.517 22:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.083 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:43.083 "name": "raid_bdev1", 00:27:43.083 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:43.083 "strip_size_kb": 0, 00:27:43.083 "state": "online", 00:27:43.083 "raid_level": "raid1", 00:27:43.083 "superblock": true, 00:27:43.083 "num_base_bdevs": 2, 00:27:43.083 "num_base_bdevs_discovered": 2, 00:27:43.083 "num_base_bdevs_operational": 2, 00:27:43.083 "process": { 00:27:43.083 "type": "rebuild", 00:27:43.083 "target": "spare", 00:27:43.083 "progress": { 00:27:43.083 "blocks": 3584, 00:27:43.083 "percent": 45 00:27:43.083 } 00:27:43.083 }, 00:27:43.083 "base_bdevs_list": [ 00:27:43.083 { 00:27:43.083 "name": "spare", 00:27:43.083 "uuid": "cb948ea8-b0c6-5ffd-a7fa-fa8260de72d1", 00:27:43.083 "is_configured": true, 00:27:43.083 "data_offset": 256, 00:27:43.083 "data_size": 7936 00:27:43.083 }, 00:27:43.083 { 00:27:43.083 "name": "BaseBdev2", 00:27:43.083 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:43.083 "is_configured": true, 00:27:43.083 "data_offset": 256, 00:27:43.083 "data_size": 7936 00:27:43.083 } 00:27:43.083 ] 00:27:43.083 }' 00:27:43.083 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:43.083 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:43.083 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:43.083 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:43.083 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:43.342 [2024-07-12 22:34:53.490531] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:43.342 [2024-07-12 22:34:53.558712] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:43.342 [2024-07-12 22:34:53.558758] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:43.342 [2024-07-12 22:34:53.558775] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:43.342 [2024-07-12 22:34:53.558783] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:43.342 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:43.342 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:43.342 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:43.342 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:43.342 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:43.342 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:43.342 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:43.342 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:43.342 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:43.342 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:43.342 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.342 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.601 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:43.601 "name": "raid_bdev1", 00:27:43.601 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:43.601 "strip_size_kb": 0, 00:27:43.601 "state": "online", 00:27:43.601 "raid_level": "raid1", 00:27:43.601 "superblock": true, 00:27:43.601 "num_base_bdevs": 2, 00:27:43.601 "num_base_bdevs_discovered": 1, 00:27:43.601 "num_base_bdevs_operational": 1, 00:27:43.601 "base_bdevs_list": [ 00:27:43.601 { 00:27:43.601 "name": null, 00:27:43.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:43.601 "is_configured": false, 00:27:43.601 "data_offset": 256, 00:27:43.601 "data_size": 7936 00:27:43.601 }, 00:27:43.601 { 00:27:43.601 "name": "BaseBdev2", 00:27:43.601 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:43.601 "is_configured": true, 00:27:43.601 "data_offset": 256, 00:27:43.601 "data_size": 7936 00:27:43.601 } 00:27:43.601 ] 00:27:43.601 }' 00:27:43.601 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:43.601 22:34:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:44.167 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:44.167 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:44.167 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:44.167 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:44.167 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:44.167 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.167 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:44.425 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:44.425 "name": "raid_bdev1", 00:27:44.425 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:44.425 "strip_size_kb": 0, 00:27:44.425 "state": "online", 00:27:44.425 "raid_level": "raid1", 00:27:44.425 "superblock": true, 00:27:44.425 "num_base_bdevs": 2, 00:27:44.425 "num_base_bdevs_discovered": 1, 00:27:44.426 "num_base_bdevs_operational": 1, 00:27:44.426 "base_bdevs_list": [ 00:27:44.426 { 00:27:44.426 "name": null, 00:27:44.426 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:44.426 "is_configured": false, 00:27:44.426 "data_offset": 
256, 00:27:44.426 "data_size": 7936 00:27:44.426 }, 00:27:44.426 { 00:27:44.426 "name": "BaseBdev2", 00:27:44.426 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:44.426 "is_configured": true, 00:27:44.426 "data_offset": 256, 00:27:44.426 "data_size": 7936 00:27:44.426 } 00:27:44.426 ] 00:27:44.426 }' 00:27:44.426 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:44.426 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:44.426 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:44.684 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:44.684 22:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:44.684 [2024-07-12 22:34:55.003004] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:44.684 [2024-07-12 22:34:55.008650] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x112cce0 00:27:44.942 [2024-07-12 22:34:55.010159] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:44.942 22:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:45.876 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:45.876 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:45.876 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:45.876 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:45.876 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:45.876 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.876 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:46.135 "name": "raid_bdev1", 00:27:46.135 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:46.135 "strip_size_kb": 0, 00:27:46.135 "state": "online", 00:27:46.135 "raid_level": "raid1", 00:27:46.135 "superblock": true, 00:27:46.135 "num_base_bdevs": 2, 00:27:46.135 "num_base_bdevs_discovered": 2, 00:27:46.135 "num_base_bdevs_operational": 2, 00:27:46.135 "process": { 00:27:46.135 "type": "rebuild", 00:27:46.135 "target": "spare", 00:27:46.135 "progress": { 00:27:46.135 "blocks": 3072, 00:27:46.135 "percent": 38 00:27:46.135 } 00:27:46.135 }, 00:27:46.135 "base_bdevs_list": [ 00:27:46.135 { 00:27:46.135 "name": "spare", 00:27:46.135 "uuid": "cb948ea8-b0c6-5ffd-a7fa-fa8260de72d1", 00:27:46.135 "is_configured": true, 00:27:46.135 "data_offset": 256, 00:27:46.135 "data_size": 7936 00:27:46.135 }, 00:27:46.135 { 00:27:46.135 "name": "BaseBdev2", 00:27:46.135 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:46.135 "is_configured": true, 00:27:46.135 "data_offset": 256, 00:27:46.135 "data_size": 7936 00:27:46.135 } 00:27:46.135 ] 00:27:46.135 }' 00:27:46.135 
22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:46.135 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1000 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.135 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.393 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:46.393 "name": "raid_bdev1", 00:27:46.393 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:46.393 "strip_size_kb": 0, 00:27:46.393 "state": "online", 00:27:46.393 "raid_level": "raid1", 00:27:46.393 "superblock": true, 00:27:46.393 "num_base_bdevs": 2, 00:27:46.393 "num_base_bdevs_discovered": 2, 00:27:46.393 "num_base_bdevs_operational": 2, 00:27:46.393 "process": { 00:27:46.393 "type": "rebuild", 00:27:46.393 "target": "spare", 00:27:46.393 "progress": { 00:27:46.393 "blocks": 3840, 00:27:46.393 "percent": 48 00:27:46.393 } 00:27:46.393 }, 00:27:46.393 "base_bdevs_list": [ 00:27:46.393 { 00:27:46.393 "name": "spare", 00:27:46.393 "uuid": "cb948ea8-b0c6-5ffd-a7fa-fa8260de72d1", 00:27:46.393 "is_configured": true, 00:27:46.393 "data_offset": 256, 00:27:46.393 "data_size": 7936 00:27:46.393 }, 00:27:46.393 { 00:27:46.393 "name": "BaseBdev2", 00:27:46.393 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:46.393 "is_configured": true, 00:27:46.393 "data_offset": 256, 00:27:46.393 "data_size": 7936 00:27:46.393 } 00:27:46.393 ] 00:27:46.393 }' 00:27:46.393 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:46.393 
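The degraded/rebuild cycle traced here follows a fixed pattern: hot-remove a base bdev, hot-add the spare, then poll the raid bdev until the rebuild process reported in its JSON disappears (the script bounds the wait with a 1000-second timeout, local timeout=1000 in the trace). A condensed sketch using only calls that appear in the trace; the loop structure itself is illustrative:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $rpc bdev_raid_remove_base_bdev BaseBdev1
    $rpc bdev_raid_add_base_bdev raid_bdev1 spare    # triggers "Started rebuild on raid bdev raid_bdev1"
    while true; do
        info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
        [ "$(echo "$info" | jq -r '.process.type // "none"')" = "none" ] && break
        echo "$info" | jq -r '.process.progress.percent'   # 38, 48, 90 ... in the trace
        sleep 1
    done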
22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:46.393 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:46.393 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:46.393 22:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:47.328 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:47.328 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:47.328 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:47.328 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:47.328 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:47.328 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:47.586 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:47.586 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:47.586 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:47.586 "name": "raid_bdev1", 00:27:47.586 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:47.586 "strip_size_kb": 0, 00:27:47.586 "state": "online", 00:27:47.586 "raid_level": "raid1", 00:27:47.586 "superblock": true, 00:27:47.586 "num_base_bdevs": 2, 00:27:47.586 "num_base_bdevs_discovered": 2, 00:27:47.586 "num_base_bdevs_operational": 2, 00:27:47.586 "process": { 00:27:47.586 "type": "rebuild", 00:27:47.586 "target": "spare", 00:27:47.586 "progress": { 00:27:47.586 "blocks": 7168, 00:27:47.586 "percent": 90 00:27:47.586 } 00:27:47.586 }, 00:27:47.586 "base_bdevs_list": [ 00:27:47.586 { 00:27:47.586 "name": "spare", 00:27:47.586 "uuid": "cb948ea8-b0c6-5ffd-a7fa-fa8260de72d1", 00:27:47.586 "is_configured": true, 00:27:47.586 "data_offset": 256, 00:27:47.586 "data_size": 7936 00:27:47.586 }, 00:27:47.586 { 00:27:47.586 "name": "BaseBdev2", 00:27:47.586 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:47.586 "is_configured": true, 00:27:47.586 "data_offset": 256, 00:27:47.586 "data_size": 7936 00:27:47.586 } 00:27:47.586 ] 00:27:47.586 }' 00:27:47.586 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:47.845 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:47.845 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:47.845 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:47.845 22:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:47.845 [2024-07-12 22:34:58.134287] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:47.845 [2024-07-12 22:34:58.134349] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:47.845 [2024-07-12 22:34:58.134432] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:27:48.782 22:34:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:48.782 22:34:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:48.782 22:34:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:48.783 22:34:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:48.783 22:34:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:48.783 22:34:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:48.783 22:34:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.783 22:34:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:49.041 "name": "raid_bdev1", 00:27:49.041 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:49.041 "strip_size_kb": 0, 00:27:49.041 "state": "online", 00:27:49.041 "raid_level": "raid1", 00:27:49.041 "superblock": true, 00:27:49.041 "num_base_bdevs": 2, 00:27:49.041 "num_base_bdevs_discovered": 2, 00:27:49.041 "num_base_bdevs_operational": 2, 00:27:49.041 "base_bdevs_list": [ 00:27:49.041 { 00:27:49.041 "name": "spare", 00:27:49.041 "uuid": "cb948ea8-b0c6-5ffd-a7fa-fa8260de72d1", 00:27:49.041 "is_configured": true, 00:27:49.041 "data_offset": 256, 00:27:49.041 "data_size": 7936 00:27:49.041 }, 00:27:49.041 { 00:27:49.041 "name": "BaseBdev2", 00:27:49.041 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:49.041 "is_configured": true, 00:27:49.041 "data_offset": 256, 00:27:49.041 "data_size": 7936 00:27:49.041 } 00:27:49.041 ] 00:27:49.041 }' 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.041 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:49.300 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:27:49.300 "name": "raid_bdev1", 00:27:49.300 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:49.300 "strip_size_kb": 0, 00:27:49.300 "state": "online", 00:27:49.300 "raid_level": "raid1", 00:27:49.300 "superblock": true, 00:27:49.300 "num_base_bdevs": 2, 00:27:49.300 "num_base_bdevs_discovered": 2, 00:27:49.300 "num_base_bdevs_operational": 2, 00:27:49.300 "base_bdevs_list": [ 00:27:49.300 { 00:27:49.300 "name": "spare", 00:27:49.300 "uuid": "cb948ea8-b0c6-5ffd-a7fa-fa8260de72d1", 00:27:49.300 "is_configured": true, 00:27:49.300 "data_offset": 256, 00:27:49.300 "data_size": 7936 00:27:49.300 }, 00:27:49.300 { 00:27:49.300 "name": "BaseBdev2", 00:27:49.300 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:49.300 "is_configured": true, 00:27:49.300 "data_offset": 256, 00:27:49.300 "data_size": 7936 00:27:49.300 } 00:27:49.300 ] 00:27:49.300 }' 00:27:49.300 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:49.300 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:49.300 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:49.559 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:49.559 "name": "raid_bdev1", 00:27:49.559 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:49.559 "strip_size_kb": 0, 00:27:49.559 "state": "online", 00:27:49.559 "raid_level": "raid1", 00:27:49.560 "superblock": true, 00:27:49.560 "num_base_bdevs": 2, 00:27:49.560 "num_base_bdevs_discovered": 2, 00:27:49.560 "num_base_bdevs_operational": 2, 00:27:49.560 "base_bdevs_list": [ 00:27:49.560 { 00:27:49.560 "name": "spare", 00:27:49.560 "uuid": "cb948ea8-b0c6-5ffd-a7fa-fa8260de72d1", 00:27:49.560 "is_configured": true, 00:27:49.560 "data_offset": 256, 00:27:49.560 "data_size": 7936 00:27:49.560 }, 00:27:49.560 { 00:27:49.560 "name": 
"BaseBdev2", 00:27:49.560 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:49.560 "is_configured": true, 00:27:49.560 "data_offset": 256, 00:27:49.560 "data_size": 7936 00:27:49.560 } 00:27:49.560 ] 00:27:49.560 }' 00:27:49.560 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:49.560 22:34:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:50.496 22:35:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:50.496 [2024-07-12 22:35:00.745771] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:50.496 [2024-07-12 22:35:00.745800] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:50.496 [2024-07-12 22:35:00.745865] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:50.496 [2024-07-12 22:35:00.745921] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:50.496 [2024-07-12 22:35:00.745940] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x112d070 name raid_bdev1, state offline 00:27:50.496 22:35:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:27:50.496 22:35:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:50.755 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:51.014 /dev/nbd0 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@867 -- # local i 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:51.014 1+0 records in 00:27:51.014 1+0 records out 00:27:51.014 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232744 s, 17.6 MB/s 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:51.014 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:51.273 /dev/nbd1 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:51.273 1+0 records in 00:27:51.273 1+0 records out 00:27:51.273 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000316556 s, 12.9 MB/s 00:27:51.273 
22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:51.273 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:51.532 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:51.532 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:51.532 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:51.532 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:51.532 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:51.532 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:51.532 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:51.532 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:51.532 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:51.532 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:51.791 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:51.791 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:51.791 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:51.791 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:51.791 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:51.791 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:51.791 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:51.791 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:51.791 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:51.791 22:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:52.050 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:52.050 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:52.050 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:52.050 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:52.050 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:52.050 22:35:02 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:52.050 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:52.050 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:52.050 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:52.050 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:52.310 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:52.310 [2024-07-12 22:35:02.631180] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:52.310 [2024-07-12 22:35:02.631226] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:52.310 [2024-07-12 22:35:02.631249] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x112c500 00:27:52.310 [2024-07-12 22:35:02.631262] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:52.310 [2024-07-12 22:35:02.632893] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:52.310 [2024-07-12 22:35:02.632921] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:52.310 [2024-07-12 22:35:02.633008] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:52.310 [2024-07-12 22:35:02.633040] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:52.310 [2024-07-12 22:35:02.633143] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:52.569 spare 00:27:52.569 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:52.569 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:52.569 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:52.569 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:52.569 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:52.569 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:52.569 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:52.569 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:52.569 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:52.569 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:52.569 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.569 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:52.569 [2024-07-12 22:35:02.733456] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x112d7b0 00:27:52.569 [2024-07-12 
22:35:02.733474] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:52.569 [2024-07-12 22:35:02.733666] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1126490 00:27:52.569 [2024-07-12 22:35:02.733814] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x112d7b0 00:27:52.569 [2024-07-12 22:35:02.733824] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x112d7b0 00:27:52.569 [2024-07-12 22:35:02.733940] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:52.827 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:52.827 "name": "raid_bdev1", 00:27:52.827 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:52.827 "strip_size_kb": 0, 00:27:52.827 "state": "online", 00:27:52.827 "raid_level": "raid1", 00:27:52.827 "superblock": true, 00:27:52.827 "num_base_bdevs": 2, 00:27:52.827 "num_base_bdevs_discovered": 2, 00:27:52.827 "num_base_bdevs_operational": 2, 00:27:52.827 "base_bdevs_list": [ 00:27:52.827 { 00:27:52.827 "name": "spare", 00:27:52.827 "uuid": "cb948ea8-b0c6-5ffd-a7fa-fa8260de72d1", 00:27:52.827 "is_configured": true, 00:27:52.827 "data_offset": 256, 00:27:52.827 "data_size": 7936 00:27:52.827 }, 00:27:52.827 { 00:27:52.827 "name": "BaseBdev2", 00:27:52.827 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:52.827 "is_configured": true, 00:27:52.827 "data_offset": 256, 00:27:52.827 "data_size": 7936 00:27:52.827 } 00:27:52.827 ] 00:27:52.827 }' 00:27:52.827 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:52.827 22:35:02 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:53.395 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:53.395 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:53.395 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:53.395 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:53.395 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:53.395 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:53.395 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:53.654 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:53.654 "name": "raid_bdev1", 00:27:53.654 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:53.654 "strip_size_kb": 0, 00:27:53.654 "state": "online", 00:27:53.654 "raid_level": "raid1", 00:27:53.654 "superblock": true, 00:27:53.654 "num_base_bdevs": 2, 00:27:53.654 "num_base_bdevs_discovered": 2, 00:27:53.654 "num_base_bdevs_operational": 2, 00:27:53.654 "base_bdevs_list": [ 00:27:53.654 { 00:27:53.654 "name": "spare", 00:27:53.654 "uuid": "cb948ea8-b0c6-5ffd-a7fa-fa8260de72d1", 00:27:53.654 "is_configured": true, 00:27:53.654 "data_offset": 256, 00:27:53.654 "data_size": 7936 00:27:53.654 }, 00:27:53.654 { 00:27:53.654 "name": "BaseBdev2", 00:27:53.654 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:53.654 "is_configured": true, 00:27:53.654 
"data_offset": 256, 00:27:53.654 "data_size": 7936 00:27:53.654 } 00:27:53.654 ] 00:27:53.654 }' 00:27:53.654 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:53.654 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:53.654 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:53.654 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:53.654 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:53.654 22:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:53.913 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:53.913 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:54.171 [2024-07-12 22:35:04.299709] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:54.171 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:54.171 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:54.171 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:54.171 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:54.171 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:54.171 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:54.171 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:54.171 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:54.171 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:54.171 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:54.171 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.171 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:54.430 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:54.430 "name": "raid_bdev1", 00:27:54.430 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:54.430 "strip_size_kb": 0, 00:27:54.430 "state": "online", 00:27:54.430 "raid_level": "raid1", 00:27:54.430 "superblock": true, 00:27:54.430 "num_base_bdevs": 2, 00:27:54.430 "num_base_bdevs_discovered": 1, 00:27:54.430 "num_base_bdevs_operational": 1, 00:27:54.430 "base_bdevs_list": [ 00:27:54.430 { 00:27:54.430 "name": null, 00:27:54.430 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:54.430 "is_configured": false, 00:27:54.430 "data_offset": 256, 00:27:54.430 "data_size": 7936 00:27:54.430 }, 00:27:54.430 { 00:27:54.430 "name": "BaseBdev2", 00:27:54.430 "uuid": 
"52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:54.430 "is_configured": true, 00:27:54.430 "data_offset": 256, 00:27:54.430 "data_size": 7936 00:27:54.430 } 00:27:54.430 ] 00:27:54.430 }' 00:27:54.430 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:54.430 22:35:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:54.998 22:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:55.257 [2024-07-12 22:35:05.418701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:55.257 [2024-07-12 22:35:05.418860] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:55.257 [2024-07-12 22:35:05.418877] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:55.257 [2024-07-12 22:35:05.418906] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:55.257 [2024-07-12 22:35:05.424431] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1126490 00:27:55.257 [2024-07-12 22:35:05.426828] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:55.257 22:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:56.194 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:56.194 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:56.194 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:56.194 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:56.194 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:56.194 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:56.194 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:56.453 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:56.453 "name": "raid_bdev1", 00:27:56.453 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:56.453 "strip_size_kb": 0, 00:27:56.453 "state": "online", 00:27:56.453 "raid_level": "raid1", 00:27:56.453 "superblock": true, 00:27:56.453 "num_base_bdevs": 2, 00:27:56.453 "num_base_bdevs_discovered": 2, 00:27:56.453 "num_base_bdevs_operational": 2, 00:27:56.453 "process": { 00:27:56.453 "type": "rebuild", 00:27:56.453 "target": "spare", 00:27:56.453 "progress": { 00:27:56.453 "blocks": 3072, 00:27:56.453 "percent": 38 00:27:56.453 } 00:27:56.453 }, 00:27:56.453 "base_bdevs_list": [ 00:27:56.453 { 00:27:56.453 "name": "spare", 00:27:56.453 "uuid": "cb948ea8-b0c6-5ffd-a7fa-fa8260de72d1", 00:27:56.453 "is_configured": true, 00:27:56.453 "data_offset": 256, 00:27:56.453 "data_size": 7936 00:27:56.453 }, 00:27:56.453 { 00:27:56.453 "name": "BaseBdev2", 00:27:56.453 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:56.453 "is_configured": true, 00:27:56.453 "data_offset": 256, 00:27:56.453 "data_size": 
7936 00:27:56.453 } 00:27:56.453 ] 00:27:56.453 }' 00:27:56.453 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:56.453 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:56.453 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:56.453 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:56.453 22:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:56.712 [2024-07-12 22:35:06.984954] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:56.971 [2024-07-12 22:35:07.039393] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:56.972 [2024-07-12 22:35:07.039438] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:56.972 [2024-07-12 22:35:07.039453] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:56.972 [2024-07-12 22:35:07.039461] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:56.972 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:56.972 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:56.972 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:56.972 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:56.972 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:56.972 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:56.972 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:56.972 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:56.972 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:56.972 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:56.972 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:56.972 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.231 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:57.231 "name": "raid_bdev1", 00:27:57.231 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:57.231 "strip_size_kb": 0, 00:27:57.231 "state": "online", 00:27:57.231 "raid_level": "raid1", 00:27:57.231 "superblock": true, 00:27:57.231 "num_base_bdevs": 2, 00:27:57.231 "num_base_bdevs_discovered": 1, 00:27:57.231 "num_base_bdevs_operational": 1, 00:27:57.231 "base_bdevs_list": [ 00:27:57.231 { 00:27:57.231 "name": null, 00:27:57.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:57.231 "is_configured": false, 00:27:57.231 "data_offset": 256, 00:27:57.231 "data_size": 7936 00:27:57.231 }, 00:27:57.231 { 00:27:57.231 
"name": "BaseBdev2", 00:27:57.231 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:57.231 "is_configured": true, 00:27:57.231 "data_offset": 256, 00:27:57.231 "data_size": 7936 00:27:57.231 } 00:27:57.231 ] 00:27:57.231 }' 00:27:57.231 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:57.231 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:57.799 22:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:57.799 [2024-07-12 22:35:08.043131] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:57.799 [2024-07-12 22:35:08.043182] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:57.799 [2024-07-12 22:35:08.043207] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1129ad0 00:27:57.799 [2024-07-12 22:35:08.043220] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:57.799 [2024-07-12 22:35:08.043589] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:57.799 [2024-07-12 22:35:08.043607] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:57.799 [2024-07-12 22:35:08.043684] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:57.799 [2024-07-12 22:35:08.043697] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:57.799 [2024-07-12 22:35:08.043708] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:57.799 [2024-07-12 22:35:08.043726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:57.799 [2024-07-12 22:35:08.048541] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1127a60 00:27:57.799 spare 00:27:57.799 [2024-07-12 22:35:08.050004] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:57.799 22:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:59.177 "name": "raid_bdev1", 00:27:59.177 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:59.177 "strip_size_kb": 0, 00:27:59.177 "state": "online", 00:27:59.177 "raid_level": "raid1", 00:27:59.177 "superblock": true, 00:27:59.177 "num_base_bdevs": 2, 00:27:59.177 "num_base_bdevs_discovered": 2, 00:27:59.177 "num_base_bdevs_operational": 2, 00:27:59.177 "process": { 00:27:59.177 "type": "rebuild", 00:27:59.177 "target": "spare", 00:27:59.177 "progress": { 00:27:59.177 "blocks": 3072, 00:27:59.177 "percent": 38 00:27:59.177 } 00:27:59.177 }, 00:27:59.177 "base_bdevs_list": [ 00:27:59.177 { 00:27:59.177 "name": "spare", 00:27:59.177 "uuid": "cb948ea8-b0c6-5ffd-a7fa-fa8260de72d1", 00:27:59.177 "is_configured": true, 00:27:59.177 "data_offset": 256, 00:27:59.177 "data_size": 7936 00:27:59.177 }, 00:27:59.177 { 00:27:59.177 "name": "BaseBdev2", 00:27:59.177 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:59.177 "is_configured": true, 00:27:59.177 "data_offset": 256, 00:27:59.177 "data_size": 7936 00:27:59.177 } 00:27:59.177 ] 00:27:59.177 }' 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:59.177 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:59.436 [2024-07-12 22:35:09.625010] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:59.436 [2024-07-12 22:35:09.662561] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:59.436 [2024-07-12 22:35:09.662606] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:59.436 [2024-07-12 22:35:09.662621] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:59.436 [2024-07-12 22:35:09.662630] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:59.436 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:59.436 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:59.436 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:59.436 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:59.436 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:59.436 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:59.436 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:59.436 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:59.436 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:59.436 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:59.436 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.436 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:59.695 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:59.695 "name": "raid_bdev1", 00:27:59.695 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:27:59.695 "strip_size_kb": 0, 00:27:59.695 "state": "online", 00:27:59.695 "raid_level": "raid1", 00:27:59.695 "superblock": true, 00:27:59.695 "num_base_bdevs": 2, 00:27:59.695 "num_base_bdevs_discovered": 1, 00:27:59.695 "num_base_bdevs_operational": 1, 00:27:59.695 "base_bdevs_list": [ 00:27:59.695 { 00:27:59.695 "name": null, 00:27:59.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:59.695 "is_configured": false, 00:27:59.695 "data_offset": 256, 00:27:59.695 "data_size": 7936 00:27:59.695 }, 00:27:59.695 { 00:27:59.695 "name": "BaseBdev2", 00:27:59.695 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:27:59.695 "is_configured": true, 00:27:59.695 "data_offset": 256, 00:27:59.695 "data_size": 7936 00:27:59.695 } 00:27:59.695 ] 00:27:59.695 }' 00:27:59.695 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:59.695 22:35:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:00.263 22:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:00.263 22:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:00.263 22:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:00.263 22:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:00.263 22:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:00.263 22:35:10 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.263 22:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.521 22:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:00.521 "name": "raid_bdev1", 00:28:00.521 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:28:00.521 "strip_size_kb": 0, 00:28:00.522 "state": "online", 00:28:00.522 "raid_level": "raid1", 00:28:00.522 "superblock": true, 00:28:00.522 "num_base_bdevs": 2, 00:28:00.522 "num_base_bdevs_discovered": 1, 00:28:00.522 "num_base_bdevs_operational": 1, 00:28:00.522 "base_bdevs_list": [ 00:28:00.522 { 00:28:00.522 "name": null, 00:28:00.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:00.522 "is_configured": false, 00:28:00.522 "data_offset": 256, 00:28:00.522 "data_size": 7936 00:28:00.522 }, 00:28:00.522 { 00:28:00.522 "name": "BaseBdev2", 00:28:00.522 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:28:00.522 "is_configured": true, 00:28:00.522 "data_offset": 256, 00:28:00.522 "data_size": 7936 00:28:00.522 } 00:28:00.522 ] 00:28:00.522 }' 00:28:00.522 22:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:00.522 22:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:00.522 22:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:00.783 22:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:00.783 22:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:00.783 22:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:01.457 [2024-07-12 22:35:11.508582] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:01.457 [2024-07-12 22:35:11.508634] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:01.457 [2024-07-12 22:35:11.508661] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1098980 00:28:01.457 [2024-07-12 22:35:11.508673] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:01.457 [2024-07-12 22:35:11.509045] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:01.457 [2024-07-12 22:35:11.509063] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:01.457 [2024-07-12 22:35:11.509129] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:01.457 [2024-07-12 22:35:11.509142] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:01.457 [2024-07-12 22:35:11.509153] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:01.457 BaseBdev1 00:28:01.457 22:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:02.395 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:02.395 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:02.395 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:02.395 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:02.395 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:02.395 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:02.395 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:02.395 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:02.395 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:02.395 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:02.395 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.395 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:02.654 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:02.654 "name": "raid_bdev1", 00:28:02.654 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:28:02.654 "strip_size_kb": 0, 00:28:02.654 "state": "online", 00:28:02.654 "raid_level": "raid1", 00:28:02.654 "superblock": true, 00:28:02.654 "num_base_bdevs": 2, 00:28:02.654 "num_base_bdevs_discovered": 1, 00:28:02.654 "num_base_bdevs_operational": 1, 00:28:02.654 "base_bdevs_list": [ 00:28:02.654 { 00:28:02.654 "name": null, 00:28:02.654 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:02.654 "is_configured": false, 00:28:02.654 "data_offset": 256, 00:28:02.654 "data_size": 7936 00:28:02.654 }, 00:28:02.654 { 00:28:02.655 "name": "BaseBdev2", 00:28:02.655 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:28:02.655 "is_configured": true, 00:28:02.655 "data_offset": 256, 00:28:02.655 "data_size": 7936 00:28:02.655 } 00:28:02.655 ] 00:28:02.655 }' 00:28:02.655 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:02.655 22:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:03.222 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:03.222 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:03.222 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:03.222 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:03.222 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:03.222 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.222 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:03.481 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 
-- # raid_bdev_info='{ 00:28:03.481 "name": "raid_bdev1", 00:28:03.481 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:28:03.481 "strip_size_kb": 0, 00:28:03.481 "state": "online", 00:28:03.481 "raid_level": "raid1", 00:28:03.481 "superblock": true, 00:28:03.481 "num_base_bdevs": 2, 00:28:03.481 "num_base_bdevs_discovered": 1, 00:28:03.481 "num_base_bdevs_operational": 1, 00:28:03.481 "base_bdevs_list": [ 00:28:03.481 { 00:28:03.481 "name": null, 00:28:03.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:03.481 "is_configured": false, 00:28:03.481 "data_offset": 256, 00:28:03.481 "data_size": 7936 00:28:03.481 }, 00:28:03.481 { 00:28:03.481 "name": "BaseBdev2", 00:28:03.481 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:28:03.481 "is_configured": true, 00:28:03.481 "data_offset": 256, 00:28:03.481 "data_size": 7936 00:28:03.481 } 00:28:03.481 ] 00:28:03.481 }' 00:28:03.481 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:03.481 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:03.481 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:03.481 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:03.481 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:03.481 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:28:03.482 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:03.482 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:03.482 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:03.482 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:03.482 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:03.482 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:03.482 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:03.482 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:03.482 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:03.482 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:03.740 [2024-07-12 22:35:13.959096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:03.740 [2024-07-12 22:35:13.959221] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: 
raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:03.740 [2024-07-12 22:35:13.959237] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:03.740 request: 00:28:03.741 { 00:28:03.741 "base_bdev": "BaseBdev1", 00:28:03.741 "raid_bdev": "raid_bdev1", 00:28:03.741 "method": "bdev_raid_add_base_bdev", 00:28:03.741 "req_id": 1 00:28:03.741 } 00:28:03.741 Got JSON-RPC error response 00:28:03.741 response: 00:28:03.741 { 00:28:03.741 "code": -22, 00:28:03.741 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:03.741 } 00:28:03.741 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:28:03.741 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:03.741 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:03.741 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:03.741 22:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:04.678 22:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:04.678 22:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:04.678 22:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:04.678 22:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:04.678 22:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:04.678 22:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:04.678 22:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:04.679 22:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:04.679 22:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:04.679 22:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:04.679 22:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.679 22:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:04.938 22:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:04.938 "name": "raid_bdev1", 00:28:04.938 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:28:04.938 "strip_size_kb": 0, 00:28:04.938 "state": "online", 00:28:04.938 "raid_level": "raid1", 00:28:04.938 "superblock": true, 00:28:04.938 "num_base_bdevs": 2, 00:28:04.938 "num_base_bdevs_discovered": 1, 00:28:04.938 "num_base_bdevs_operational": 1, 00:28:04.938 "base_bdevs_list": [ 00:28:04.938 { 00:28:04.938 "name": null, 00:28:04.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:04.938 "is_configured": false, 00:28:04.938 "data_offset": 256, 00:28:04.938 "data_size": 7936 00:28:04.938 }, 00:28:04.938 { 00:28:04.938 "name": "BaseBdev2", 00:28:04.938 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:28:04.938 "is_configured": true, 00:28:04.938 "data_offset": 256, 00:28:04.938 "data_size": 7936 
00:28:04.938 } 00:28:04.938 ] 00:28:04.938 }' 00:28:04.938 22:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:04.938 22:35:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:05.875 22:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:05.875 22:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:05.875 22:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:05.875 22:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:05.875 22:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:05.875 22:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.875 22:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:05.875 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:05.875 "name": "raid_bdev1", 00:28:05.875 "uuid": "6cdcd856-b465-4b09-9c0f-c9ad3b66d051", 00:28:05.875 "strip_size_kb": 0, 00:28:05.875 "state": "online", 00:28:05.875 "raid_level": "raid1", 00:28:05.875 "superblock": true, 00:28:05.875 "num_base_bdevs": 2, 00:28:05.875 "num_base_bdevs_discovered": 1, 00:28:05.875 "num_base_bdevs_operational": 1, 00:28:05.876 "base_bdevs_list": [ 00:28:05.876 { 00:28:05.876 "name": null, 00:28:05.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:05.876 "is_configured": false, 00:28:05.876 "data_offset": 256, 00:28:05.876 "data_size": 7936 00:28:05.876 }, 00:28:05.876 { 00:28:05.876 "name": "BaseBdev2", 00:28:05.876 "uuid": "52ff96d5-65f2-5b9e-a070-fdc130f22ded", 00:28:05.876 "is_configured": true, 00:28:05.876 "data_offset": 256, 00:28:05.876 "data_size": 7936 00:28:05.876 } 00:28:05.876 ] 00:28:05.876 }' 00:28:05.876 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:05.876 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:05.876 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:05.876 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:05.876 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 3563365 00:28:05.876 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 3563365 ']' 00:28:05.876 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 3563365 00:28:05.876 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:28:05.876 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:05.876 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3563365 00:28:06.135 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:06.135 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:06.135 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 3563365' 00:28:06.135 killing process with pid 3563365 00:28:06.135 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 3563365 00:28:06.135 Received shutdown signal, test time was about 60.000000 seconds 00:28:06.135 00:28:06.135 Latency(us) 00:28:06.135 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:06.135 =================================================================================================================== 00:28:06.135 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:06.135 [2024-07-12 22:35:16.231620] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:06.135 [2024-07-12 22:35:16.231713] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:06.135 [2024-07-12 22:35:16.231762] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:06.135 [2024-07-12 22:35:16.231775] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x112d7b0 name raid_bdev1, state offline 00:28:06.135 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 3563365 00:28:06.135 [2024-07-12 22:35:16.262315] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:06.395 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:28:06.395 00:28:06.395 real 0m31.579s 00:28:06.395 user 0m49.197s 00:28:06.395 sys 0m5.143s 00:28:06.395 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:06.395 22:35:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:06.395 ************************************ 00:28:06.395 END TEST raid_rebuild_test_sb_4k 00:28:06.395 ************************************ 00:28:06.395 22:35:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:06.395 22:35:16 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:28:06.395 22:35:16 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:28:06.395 22:35:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:06.395 22:35:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:06.395 22:35:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:06.395 ************************************ 00:28:06.395 START TEST raid_state_function_test_sb_md_separate 00:28:06.395 ************************************ 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:06.395 22:35:16 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=3567867 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3567867' 00:28:06.395 Process raid pid: 3567867 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 3567867 /var/tmp/spdk-raid.sock 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 3567867 ']' 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:06.395 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
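The md_separate state-function test that begins here drives the same RPC surface. A minimal sketch of its setup, assuming the bdev_svc app and socket path shown in the log; ordering and waits are simplified, and only calls that appear in this trace are used:

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# start the bare bdev application with raid debug logging, as the test harness does
$SPDK_DIR/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
until [ -S /var/tmp/spdk-raid.sock ]; do sleep 0.1; done   # wait for the RPC socket
# declare the raid1 array with superblocks before its base bdevs exist; state stays "configuring"
$RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
# create a 4096-byte-block malloc bdev with separate 32-byte metadata for BaseBdev1
$RPC bdev_malloc_create 32 4096 -m 32 -b BaseBdev1
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'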
00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:06.395 22:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:06.395 [2024-07-12 22:35:16.650156] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:28:06.395 [2024-07-12 22:35:16.650226] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:06.654 [2024-07-12 22:35:16.781339] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:06.654 [2024-07-12 22:35:16.880380] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:06.654 [2024-07-12 22:35:16.943187] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:06.654 [2024-07-12 22:35:16.943224] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:07.589 [2024-07-12 22:35:17.733508] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:07.589 [2024-07-12 22:35:17.733553] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:07.589 [2024-07-12 22:35:17.733564] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:07.589 [2024-07-12 22:35:17.733576] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.589 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:07.849 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:07.849 "name": "Existed_Raid", 00:28:07.849 "uuid": "236c314c-72a2-4c88-a1e4-68671da8db16", 00:28:07.849 "strip_size_kb": 0, 00:28:07.849 "state": "configuring", 00:28:07.849 "raid_level": "raid1", 00:28:07.849 "superblock": true, 00:28:07.849 "num_base_bdevs": 2, 00:28:07.849 "num_base_bdevs_discovered": 0, 00:28:07.849 "num_base_bdevs_operational": 2, 00:28:07.849 "base_bdevs_list": [ 00:28:07.849 { 00:28:07.849 "name": "BaseBdev1", 00:28:07.849 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:07.849 "is_configured": false, 00:28:07.849 "data_offset": 0, 00:28:07.849 "data_size": 0 00:28:07.849 }, 00:28:07.849 { 00:28:07.849 "name": "BaseBdev2", 00:28:07.849 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:07.849 "is_configured": false, 00:28:07.849 "data_offset": 0, 00:28:07.849 "data_size": 0 00:28:07.849 } 00:28:07.849 ] 00:28:07.849 }' 00:28:07.849 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:07.849 22:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:08.416 22:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:08.416 [2024-07-12 22:35:18.695899] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:08.416 [2024-07-12 22:35:18.695939] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fa2a80 name Existed_Raid, state configuring 00:28:08.416 22:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:08.674 [2024-07-12 22:35:18.884427] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:08.674 [2024-07-12 22:35:18.884455] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:08.674 [2024-07-12 22:35:18.884464] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:08.674 [2024-07-12 22:35:18.884476] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:08.674 22:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:28:08.933 [2024-07-12 22:35:19.067374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:08.933 BaseBdev1 00:28:08.933 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:08.933 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:28:08.933 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:08.933 
22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:28:08.933 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:08.933 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:08.933 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:08.933 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:09.191 [ 00:28:09.191 { 00:28:09.191 "name": "BaseBdev1", 00:28:09.191 "aliases": [ 00:28:09.191 "27d5a131-0e9f-46d2-93e4-7da95cdcfced" 00:28:09.191 ], 00:28:09.191 "product_name": "Malloc disk", 00:28:09.191 "block_size": 4096, 00:28:09.191 "num_blocks": 8192, 00:28:09.191 "uuid": "27d5a131-0e9f-46d2-93e4-7da95cdcfced", 00:28:09.191 "md_size": 32, 00:28:09.191 "md_interleave": false, 00:28:09.191 "dif_type": 0, 00:28:09.191 "assigned_rate_limits": { 00:28:09.191 "rw_ios_per_sec": 0, 00:28:09.191 "rw_mbytes_per_sec": 0, 00:28:09.191 "r_mbytes_per_sec": 0, 00:28:09.191 "w_mbytes_per_sec": 0 00:28:09.191 }, 00:28:09.191 "claimed": true, 00:28:09.191 "claim_type": "exclusive_write", 00:28:09.191 "zoned": false, 00:28:09.191 "supported_io_types": { 00:28:09.191 "read": true, 00:28:09.191 "write": true, 00:28:09.191 "unmap": true, 00:28:09.191 "flush": true, 00:28:09.191 "reset": true, 00:28:09.191 "nvme_admin": false, 00:28:09.191 "nvme_io": false, 00:28:09.191 "nvme_io_md": false, 00:28:09.191 "write_zeroes": true, 00:28:09.191 "zcopy": true, 00:28:09.191 "get_zone_info": false, 00:28:09.191 "zone_management": false, 00:28:09.191 "zone_append": false, 00:28:09.191 "compare": false, 00:28:09.191 "compare_and_write": false, 00:28:09.191 "abort": true, 00:28:09.191 "seek_hole": false, 00:28:09.191 "seek_data": false, 00:28:09.191 "copy": true, 00:28:09.191 "nvme_iov_md": false 00:28:09.191 }, 00:28:09.191 "memory_domains": [ 00:28:09.191 { 00:28:09.191 "dma_device_id": "system", 00:28:09.191 "dma_device_type": 1 00:28:09.191 }, 00:28:09.191 { 00:28:09.191 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:09.191 "dma_device_type": 2 00:28:09.191 } 00:28:09.191 ], 00:28:09.191 "driver_specific": {} 00:28:09.191 } 00:28:09.191 ] 00:28:09.191 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:28:09.191 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:09.191 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:09.191 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:09.191 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:09.191 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:09.191 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:09.191 22:35:19 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:09.191 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:09.191 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:09.191 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:09.191 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:09.191 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:09.451 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:09.451 "name": "Existed_Raid", 00:28:09.451 "uuid": "d8567583-e28c-449f-8903-19b0f66a9f80", 00:28:09.451 "strip_size_kb": 0, 00:28:09.451 "state": "configuring", 00:28:09.451 "raid_level": "raid1", 00:28:09.451 "superblock": true, 00:28:09.451 "num_base_bdevs": 2, 00:28:09.451 "num_base_bdevs_discovered": 1, 00:28:09.451 "num_base_bdevs_operational": 2, 00:28:09.451 "base_bdevs_list": [ 00:28:09.451 { 00:28:09.451 "name": "BaseBdev1", 00:28:09.451 "uuid": "27d5a131-0e9f-46d2-93e4-7da95cdcfced", 00:28:09.451 "is_configured": true, 00:28:09.451 "data_offset": 256, 00:28:09.451 "data_size": 7936 00:28:09.451 }, 00:28:09.451 { 00:28:09.451 "name": "BaseBdev2", 00:28:09.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:09.451 "is_configured": false, 00:28:09.451 "data_offset": 0, 00:28:09.451 "data_size": 0 00:28:09.451 } 00:28:09.451 ] 00:28:09.451 }' 00:28:09.451 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:09.451 22:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:10.018 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:10.277 [2024-07-12 22:35:20.418990] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:10.277 [2024-07-12 22:35:20.419035] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fa2350 name Existed_Raid, state configuring 00:28:10.277 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:10.844 [2024-07-12 22:35:20.924355] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:10.844 [2024-07-12 22:35:20.925795] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:10.844 [2024-07-12 22:35:20.925831] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.844 22:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:11.102 22:35:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:11.102 "name": "Existed_Raid", 00:28:11.102 "uuid": "9a4c2dc6-5bd8-4d54-9e8a-de3cb7a306c2", 00:28:11.102 "strip_size_kb": 0, 00:28:11.102 "state": "configuring", 00:28:11.102 "raid_level": "raid1", 00:28:11.102 "superblock": true, 00:28:11.102 "num_base_bdevs": 2, 00:28:11.102 "num_base_bdevs_discovered": 1, 00:28:11.102 "num_base_bdevs_operational": 2, 00:28:11.102 "base_bdevs_list": [ 00:28:11.102 { 00:28:11.102 "name": "BaseBdev1", 00:28:11.102 "uuid": "27d5a131-0e9f-46d2-93e4-7da95cdcfced", 00:28:11.102 "is_configured": true, 00:28:11.102 "data_offset": 256, 00:28:11.102 "data_size": 7936 00:28:11.102 }, 00:28:11.102 { 00:28:11.102 "name": "BaseBdev2", 00:28:11.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:11.102 "is_configured": false, 00:28:11.102 "data_offset": 0, 00:28:11.102 "data_size": 0 00:28:11.102 } 00:28:11.102 ] 00:28:11.102 }' 00:28:11.102 22:35:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:11.102 22:35:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:11.669 22:35:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:28:11.669 [2024-07-12 22:35:21.975211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:11.669 [2024-07-12 22:35:21.975355] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fa4210 00:28:11.669 [2024-07-12 22:35:21.975369] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:11.669 [2024-07-12 22:35:21.975433] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa3c50 00:28:11.669 [2024-07-12 
22:35:21.975531] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fa4210 00:28:11.669 [2024-07-12 22:35:21.975541] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fa4210 00:28:11.669 [2024-07-12 22:35:21.975605] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:11.669 BaseBdev2 00:28:11.669 22:35:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:11.669 22:35:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:28:11.669 22:35:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:11.669 22:35:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:28:11.669 22:35:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:11.669 22:35:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:11.669 22:35:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:11.927 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:12.186 [ 00:28:12.186 { 00:28:12.186 "name": "BaseBdev2", 00:28:12.186 "aliases": [ 00:28:12.186 "91c1a00b-db6f-4b5c-9577-7839ac4b3f15" 00:28:12.186 ], 00:28:12.186 "product_name": "Malloc disk", 00:28:12.186 "block_size": 4096, 00:28:12.186 "num_blocks": 8192, 00:28:12.186 "uuid": "91c1a00b-db6f-4b5c-9577-7839ac4b3f15", 00:28:12.186 "md_size": 32, 00:28:12.186 "md_interleave": false, 00:28:12.186 "dif_type": 0, 00:28:12.186 "assigned_rate_limits": { 00:28:12.186 "rw_ios_per_sec": 0, 00:28:12.186 "rw_mbytes_per_sec": 0, 00:28:12.186 "r_mbytes_per_sec": 0, 00:28:12.186 "w_mbytes_per_sec": 0 00:28:12.186 }, 00:28:12.186 "claimed": true, 00:28:12.186 "claim_type": "exclusive_write", 00:28:12.186 "zoned": false, 00:28:12.186 "supported_io_types": { 00:28:12.186 "read": true, 00:28:12.186 "write": true, 00:28:12.186 "unmap": true, 00:28:12.186 "flush": true, 00:28:12.186 "reset": true, 00:28:12.186 "nvme_admin": false, 00:28:12.186 "nvme_io": false, 00:28:12.186 "nvme_io_md": false, 00:28:12.186 "write_zeroes": true, 00:28:12.186 "zcopy": true, 00:28:12.186 "get_zone_info": false, 00:28:12.186 "zone_management": false, 00:28:12.186 "zone_append": false, 00:28:12.186 "compare": false, 00:28:12.186 "compare_and_write": false, 00:28:12.186 "abort": true, 00:28:12.186 "seek_hole": false, 00:28:12.186 "seek_data": false, 00:28:12.186 "copy": true, 00:28:12.186 "nvme_iov_md": false 00:28:12.186 }, 00:28:12.186 "memory_domains": [ 00:28:12.186 { 00:28:12.186 "dma_device_id": "system", 00:28:12.186 "dma_device_type": 1 00:28:12.186 }, 00:28:12.186 { 00:28:12.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:12.186 "dma_device_type": 2 00:28:12.186 } 00:28:12.186 ], 00:28:12.186 "driver_specific": {} 00:28:12.186 } 00:28:12.186 ] 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.186 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:12.445 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:12.445 "name": "Existed_Raid", 00:28:12.445 "uuid": "9a4c2dc6-5bd8-4d54-9e8a-de3cb7a306c2", 00:28:12.445 "strip_size_kb": 0, 00:28:12.445 "state": "online", 00:28:12.445 "raid_level": "raid1", 00:28:12.445 "superblock": true, 00:28:12.445 "num_base_bdevs": 2, 00:28:12.445 "num_base_bdevs_discovered": 2, 00:28:12.445 "num_base_bdevs_operational": 2, 00:28:12.445 "base_bdevs_list": [ 00:28:12.445 { 00:28:12.445 "name": "BaseBdev1", 00:28:12.445 "uuid": "27d5a131-0e9f-46d2-93e4-7da95cdcfced", 00:28:12.445 "is_configured": true, 00:28:12.445 "data_offset": 256, 00:28:12.445 "data_size": 7936 00:28:12.445 }, 00:28:12.445 { 00:28:12.445 "name": "BaseBdev2", 00:28:12.445 "uuid": "91c1a00b-db6f-4b5c-9577-7839ac4b3f15", 00:28:12.445 "is_configured": true, 00:28:12.445 "data_offset": 256, 00:28:12.445 "data_size": 7936 00:28:12.445 } 00:28:12.445 ] 00:28:12.445 }' 00:28:12.445 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:12.445 22:35:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:13.010 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:13.010 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:13.010 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:13.010 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # 
local base_bdev_info 00:28:13.010 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:13.010 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:13.010 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:13.010 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:13.269 [2024-07-12 22:35:23.343129] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:13.269 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:13.269 "name": "Existed_Raid", 00:28:13.269 "aliases": [ 00:28:13.269 "9a4c2dc6-5bd8-4d54-9e8a-de3cb7a306c2" 00:28:13.269 ], 00:28:13.269 "product_name": "Raid Volume", 00:28:13.269 "block_size": 4096, 00:28:13.269 "num_blocks": 7936, 00:28:13.269 "uuid": "9a4c2dc6-5bd8-4d54-9e8a-de3cb7a306c2", 00:28:13.269 "md_size": 32, 00:28:13.269 "md_interleave": false, 00:28:13.269 "dif_type": 0, 00:28:13.269 "assigned_rate_limits": { 00:28:13.269 "rw_ios_per_sec": 0, 00:28:13.269 "rw_mbytes_per_sec": 0, 00:28:13.269 "r_mbytes_per_sec": 0, 00:28:13.269 "w_mbytes_per_sec": 0 00:28:13.269 }, 00:28:13.269 "claimed": false, 00:28:13.269 "zoned": false, 00:28:13.269 "supported_io_types": { 00:28:13.269 "read": true, 00:28:13.269 "write": true, 00:28:13.269 "unmap": false, 00:28:13.269 "flush": false, 00:28:13.269 "reset": true, 00:28:13.269 "nvme_admin": false, 00:28:13.269 "nvme_io": false, 00:28:13.269 "nvme_io_md": false, 00:28:13.269 "write_zeroes": true, 00:28:13.269 "zcopy": false, 00:28:13.269 "get_zone_info": false, 00:28:13.269 "zone_management": false, 00:28:13.269 "zone_append": false, 00:28:13.269 "compare": false, 00:28:13.269 "compare_and_write": false, 00:28:13.269 "abort": false, 00:28:13.269 "seek_hole": false, 00:28:13.269 "seek_data": false, 00:28:13.269 "copy": false, 00:28:13.269 "nvme_iov_md": false 00:28:13.269 }, 00:28:13.269 "memory_domains": [ 00:28:13.269 { 00:28:13.269 "dma_device_id": "system", 00:28:13.269 "dma_device_type": 1 00:28:13.269 }, 00:28:13.269 { 00:28:13.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:13.269 "dma_device_type": 2 00:28:13.269 }, 00:28:13.269 { 00:28:13.269 "dma_device_id": "system", 00:28:13.269 "dma_device_type": 1 00:28:13.269 }, 00:28:13.269 { 00:28:13.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:13.269 "dma_device_type": 2 00:28:13.269 } 00:28:13.269 ], 00:28:13.269 "driver_specific": { 00:28:13.269 "raid": { 00:28:13.269 "uuid": "9a4c2dc6-5bd8-4d54-9e8a-de3cb7a306c2", 00:28:13.269 "strip_size_kb": 0, 00:28:13.269 "state": "online", 00:28:13.269 "raid_level": "raid1", 00:28:13.269 "superblock": true, 00:28:13.269 "num_base_bdevs": 2, 00:28:13.269 "num_base_bdevs_discovered": 2, 00:28:13.269 "num_base_bdevs_operational": 2, 00:28:13.269 "base_bdevs_list": [ 00:28:13.269 { 00:28:13.269 "name": "BaseBdev1", 00:28:13.269 "uuid": "27d5a131-0e9f-46d2-93e4-7da95cdcfced", 00:28:13.269 "is_configured": true, 00:28:13.269 "data_offset": 256, 00:28:13.269 "data_size": 7936 00:28:13.269 }, 00:28:13.269 { 00:28:13.269 "name": "BaseBdev2", 00:28:13.269 "uuid": "91c1a00b-db6f-4b5c-9577-7839ac4b3f15", 00:28:13.269 "is_configured": true, 00:28:13.269 "data_offset": 256, 00:28:13.269 "data_size": 7936 00:28:13.269 } 00:28:13.269 
] 00:28:13.269 } 00:28:13.269 } 00:28:13.269 }' 00:28:13.269 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:13.269 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:13.269 BaseBdev2' 00:28:13.269 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:13.269 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:13.269 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:13.528 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:13.528 "name": "BaseBdev1", 00:28:13.528 "aliases": [ 00:28:13.528 "27d5a131-0e9f-46d2-93e4-7da95cdcfced" 00:28:13.528 ], 00:28:13.528 "product_name": "Malloc disk", 00:28:13.528 "block_size": 4096, 00:28:13.528 "num_blocks": 8192, 00:28:13.528 "uuid": "27d5a131-0e9f-46d2-93e4-7da95cdcfced", 00:28:13.528 "md_size": 32, 00:28:13.528 "md_interleave": false, 00:28:13.528 "dif_type": 0, 00:28:13.528 "assigned_rate_limits": { 00:28:13.528 "rw_ios_per_sec": 0, 00:28:13.528 "rw_mbytes_per_sec": 0, 00:28:13.528 "r_mbytes_per_sec": 0, 00:28:13.528 "w_mbytes_per_sec": 0 00:28:13.528 }, 00:28:13.528 "claimed": true, 00:28:13.528 "claim_type": "exclusive_write", 00:28:13.528 "zoned": false, 00:28:13.528 "supported_io_types": { 00:28:13.528 "read": true, 00:28:13.528 "write": true, 00:28:13.528 "unmap": true, 00:28:13.528 "flush": true, 00:28:13.528 "reset": true, 00:28:13.528 "nvme_admin": false, 00:28:13.528 "nvme_io": false, 00:28:13.528 "nvme_io_md": false, 00:28:13.528 "write_zeroes": true, 00:28:13.528 "zcopy": true, 00:28:13.528 "get_zone_info": false, 00:28:13.528 "zone_management": false, 00:28:13.528 "zone_append": false, 00:28:13.528 "compare": false, 00:28:13.528 "compare_and_write": false, 00:28:13.528 "abort": true, 00:28:13.528 "seek_hole": false, 00:28:13.528 "seek_data": false, 00:28:13.528 "copy": true, 00:28:13.528 "nvme_iov_md": false 00:28:13.528 }, 00:28:13.528 "memory_domains": [ 00:28:13.528 { 00:28:13.528 "dma_device_id": "system", 00:28:13.528 "dma_device_type": 1 00:28:13.528 }, 00:28:13.528 { 00:28:13.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:13.528 "dma_device_type": 2 00:28:13.528 } 00:28:13.528 ], 00:28:13.528 "driver_specific": {} 00:28:13.528 }' 00:28:13.528 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:13.528 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:13.528 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:13.528 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:13.528 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:13.528 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:13.528 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:13.787 22:35:23 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:13.787 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:13.787 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:13.787 22:35:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:13.787 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:13.787 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:13.787 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:13.787 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:14.046 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:14.046 "name": "BaseBdev2", 00:28:14.046 "aliases": [ 00:28:14.046 "91c1a00b-db6f-4b5c-9577-7839ac4b3f15" 00:28:14.046 ], 00:28:14.046 "product_name": "Malloc disk", 00:28:14.046 "block_size": 4096, 00:28:14.046 "num_blocks": 8192, 00:28:14.046 "uuid": "91c1a00b-db6f-4b5c-9577-7839ac4b3f15", 00:28:14.046 "md_size": 32, 00:28:14.046 "md_interleave": false, 00:28:14.046 "dif_type": 0, 00:28:14.046 "assigned_rate_limits": { 00:28:14.046 "rw_ios_per_sec": 0, 00:28:14.046 "rw_mbytes_per_sec": 0, 00:28:14.046 "r_mbytes_per_sec": 0, 00:28:14.046 "w_mbytes_per_sec": 0 00:28:14.046 }, 00:28:14.046 "claimed": true, 00:28:14.046 "claim_type": "exclusive_write", 00:28:14.046 "zoned": false, 00:28:14.046 "supported_io_types": { 00:28:14.046 "read": true, 00:28:14.046 "write": true, 00:28:14.046 "unmap": true, 00:28:14.046 "flush": true, 00:28:14.046 "reset": true, 00:28:14.046 "nvme_admin": false, 00:28:14.046 "nvme_io": false, 00:28:14.046 "nvme_io_md": false, 00:28:14.046 "write_zeroes": true, 00:28:14.046 "zcopy": true, 00:28:14.046 "get_zone_info": false, 00:28:14.046 "zone_management": false, 00:28:14.046 "zone_append": false, 00:28:14.046 "compare": false, 00:28:14.046 "compare_and_write": false, 00:28:14.046 "abort": true, 00:28:14.046 "seek_hole": false, 00:28:14.046 "seek_data": false, 00:28:14.046 "copy": true, 00:28:14.046 "nvme_iov_md": false 00:28:14.046 }, 00:28:14.046 "memory_domains": [ 00:28:14.046 { 00:28:14.046 "dma_device_id": "system", 00:28:14.046 "dma_device_type": 1 00:28:14.046 }, 00:28:14.046 { 00:28:14.046 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:14.046 "dma_device_type": 2 00:28:14.046 } 00:28:14.046 ], 00:28:14.046 "driver_specific": {} 00:28:14.046 }' 00:28:14.046 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:14.046 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:14.046 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:14.046 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:14.304 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:14.304 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 
]] 00:28:14.304 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:14.304 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:14.304 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:14.304 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:14.304 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:14.563 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:14.563 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:14.563 [2024-07-12 22:35:24.858924] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:14.821 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:14.822 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:14.822 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:14.822 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:14.822 22:35:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:14.822 22:35:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:14.822 "name": "Existed_Raid", 00:28:14.822 "uuid": "9a4c2dc6-5bd8-4d54-9e8a-de3cb7a306c2", 00:28:14.822 "strip_size_kb": 0, 
00:28:14.822 "state": "online", 00:28:14.822 "raid_level": "raid1", 00:28:14.822 "superblock": true, 00:28:14.822 "num_base_bdevs": 2, 00:28:14.822 "num_base_bdevs_discovered": 1, 00:28:14.822 "num_base_bdevs_operational": 1, 00:28:14.822 "base_bdevs_list": [ 00:28:14.822 { 00:28:14.822 "name": null, 00:28:14.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:14.822 "is_configured": false, 00:28:14.822 "data_offset": 256, 00:28:14.822 "data_size": 7936 00:28:14.822 }, 00:28:14.822 { 00:28:14.822 "name": "BaseBdev2", 00:28:14.822 "uuid": "91c1a00b-db6f-4b5c-9577-7839ac4b3f15", 00:28:14.822 "is_configured": true, 00:28:14.822 "data_offset": 256, 00:28:14.822 "data_size": 7936 00:28:14.822 } 00:28:14.822 ] 00:28:14.822 }' 00:28:14.822 22:35:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:14.822 22:35:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:15.388 22:35:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:15.388 22:35:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:15.388 22:35:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.388 22:35:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:15.646 22:35:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:15.646 22:35:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:15.646 22:35:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:15.906 [2024-07-12 22:35:26.126783] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:15.906 [2024-07-12 22:35:26.126872] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:15.906 [2024-07-12 22:35:26.138302] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:15.906 [2024-07-12 22:35:26.138338] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:15.906 [2024-07-12 22:35:26.138350] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fa4210 name Existed_Raid, state offline 00:28:15.906 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:15.906 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:15.906 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.906 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:16.165 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:16.165 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:16.166 22:35:26 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:16.166 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 3567867 00:28:16.166 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 3567867 ']' 00:28:16.166 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 3567867 00:28:16.166 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:16.166 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:16.166 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3567867 00:28:16.166 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:16.166 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:16.166 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3567867' 00:28:16.166 killing process with pid 3567867 00:28:16.166 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 3567867 00:28:16.166 [2024-07-12 22:35:26.453272] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:16.166 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 3567867 00:28:16.166 [2024-07-12 22:35:26.454191] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:16.424 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:28:16.424 00:28:16.424 real 0m10.103s 00:28:16.424 user 0m17.899s 00:28:16.424 sys 0m1.928s 00:28:16.425 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:16.425 22:35:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:16.425 ************************************ 00:28:16.425 END TEST raid_state_function_test_sb_md_separate 00:28:16.425 ************************************ 00:28:16.425 22:35:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:16.425 22:35:26 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:28:16.425 22:35:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:28:16.425 22:35:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:16.425 22:35:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:16.683 ************************************ 00:28:16.683 START TEST raid_superblock_test_md_separate 00:28:16.683 ************************************ 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:16.683 22:35:26 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=3569329 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 3569329 /var/tmp/spdk-raid.sock 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 3569329 ']' 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:16.683 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:16.683 22:35:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:16.684 22:35:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:16.684 [2024-07-12 22:35:26.827383] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:28:16.684 [2024-07-12 22:35:26.827448] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3569329 ] 00:28:16.684 [2024-07-12 22:35:26.958504] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:16.943 [2024-07-12 22:35:27.070135] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:16.943 [2024-07-12 22:35:27.139493] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:16.943 [2024-07-12 22:35:27.139531] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:17.513 22:35:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:17.513 22:35:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:17.513 22:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:17.513 22:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:17.513 22:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:17.513 22:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:17.513 22:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:17.513 22:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:17.513 22:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:17.513 22:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:17.513 22:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:28:17.772 malloc1 00:28:17.772 22:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:18.031 [2024-07-12 22:35:28.222535] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:18.031 [2024-07-12 22:35:28.222582] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:18.031 [2024-07-12 22:35:28.222603] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c2f830 00:28:18.031 [2024-07-12 22:35:28.222616] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:18.031 [2024-07-12 22:35:28.224180] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:18.031 [2024-07-12 22:35:28.224212] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:18.031 pt1 00:28:18.031 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:18.031 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:18.031 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc2 00:28:18.031 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:18.031 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:18.032 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:18.032 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:18.032 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:18.032 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:28:18.290 malloc2 00:28:18.290 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:18.548 [2024-07-12 22:35:28.722764] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:18.548 [2024-07-12 22:35:28.722809] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:18.548 [2024-07-12 22:35:28.722830] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c21250 00:28:18.548 [2024-07-12 22:35:28.722843] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:18.548 [2024-07-12 22:35:28.724300] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:18.548 [2024-07-12 22:35:28.724326] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:18.548 pt2 00:28:18.548 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:18.548 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:18.548 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:18.806 [2024-07-12 22:35:28.967427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:18.806 [2024-07-12 22:35:28.968839] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:18.806 [2024-07-12 22:35:28.968996] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c21d20 00:28:18.806 [2024-07-12 22:35:28.969010] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:18.806 [2024-07-12 22:35:28.969087] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c15a60 00:28:18.806 [2024-07-12 22:35:28.969204] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c21d20 00:28:18.806 [2024-07-12 22:35:28.969214] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c21d20 00:28:18.806 [2024-07-12 22:35:28.969286] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:18.806 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:18.806 22:35:28 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:18.806 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:18.806 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:18.806 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:18.806 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:18.806 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:18.806 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:18.806 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:18.806 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:18.806 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:18.806 22:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.063 22:35:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:19.063 "name": "raid_bdev1", 00:28:19.063 "uuid": "71abfa54-7205-4c88-a3b7-1d3516d61613", 00:28:19.063 "strip_size_kb": 0, 00:28:19.063 "state": "online", 00:28:19.063 "raid_level": "raid1", 00:28:19.063 "superblock": true, 00:28:19.063 "num_base_bdevs": 2, 00:28:19.063 "num_base_bdevs_discovered": 2, 00:28:19.063 "num_base_bdevs_operational": 2, 00:28:19.063 "base_bdevs_list": [ 00:28:19.063 { 00:28:19.063 "name": "pt1", 00:28:19.063 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:19.063 "is_configured": true, 00:28:19.063 "data_offset": 256, 00:28:19.063 "data_size": 7936 00:28:19.063 }, 00:28:19.063 { 00:28:19.063 "name": "pt2", 00:28:19.063 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:19.063 "is_configured": true, 00:28:19.063 "data_offset": 256, 00:28:19.063 "data_size": 7936 00:28:19.063 } 00:28:19.063 ] 00:28:19.063 }' 00:28:19.063 22:35:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:19.064 22:35:29 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:19.629 22:35:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:19.629 22:35:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:19.629 22:35:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:19.629 22:35:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:19.629 22:35:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:19.629 22:35:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:19.629 22:35:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:19.629 22:35:29 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:19.887 [2024-07-12 22:35:30.022484] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:19.887 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:19.887 "name": "raid_bdev1", 00:28:19.887 "aliases": [ 00:28:19.887 "71abfa54-7205-4c88-a3b7-1d3516d61613" 00:28:19.887 ], 00:28:19.887 "product_name": "Raid Volume", 00:28:19.887 "block_size": 4096, 00:28:19.887 "num_blocks": 7936, 00:28:19.887 "uuid": "71abfa54-7205-4c88-a3b7-1d3516d61613", 00:28:19.887 "md_size": 32, 00:28:19.887 "md_interleave": false, 00:28:19.887 "dif_type": 0, 00:28:19.888 "assigned_rate_limits": { 00:28:19.888 "rw_ios_per_sec": 0, 00:28:19.888 "rw_mbytes_per_sec": 0, 00:28:19.888 "r_mbytes_per_sec": 0, 00:28:19.888 "w_mbytes_per_sec": 0 00:28:19.888 }, 00:28:19.888 "claimed": false, 00:28:19.888 "zoned": false, 00:28:19.888 "supported_io_types": { 00:28:19.888 "read": true, 00:28:19.888 "write": true, 00:28:19.888 "unmap": false, 00:28:19.888 "flush": false, 00:28:19.888 "reset": true, 00:28:19.888 "nvme_admin": false, 00:28:19.888 "nvme_io": false, 00:28:19.888 "nvme_io_md": false, 00:28:19.888 "write_zeroes": true, 00:28:19.888 "zcopy": false, 00:28:19.888 "get_zone_info": false, 00:28:19.888 "zone_management": false, 00:28:19.888 "zone_append": false, 00:28:19.888 "compare": false, 00:28:19.888 "compare_and_write": false, 00:28:19.888 "abort": false, 00:28:19.888 "seek_hole": false, 00:28:19.888 "seek_data": false, 00:28:19.888 "copy": false, 00:28:19.888 "nvme_iov_md": false 00:28:19.888 }, 00:28:19.888 "memory_domains": [ 00:28:19.888 { 00:28:19.888 "dma_device_id": "system", 00:28:19.888 "dma_device_type": 1 00:28:19.888 }, 00:28:19.888 { 00:28:19.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:19.888 "dma_device_type": 2 00:28:19.888 }, 00:28:19.888 { 00:28:19.888 "dma_device_id": "system", 00:28:19.888 "dma_device_type": 1 00:28:19.888 }, 00:28:19.888 { 00:28:19.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:19.888 "dma_device_type": 2 00:28:19.888 } 00:28:19.888 ], 00:28:19.888 "driver_specific": { 00:28:19.888 "raid": { 00:28:19.888 "uuid": "71abfa54-7205-4c88-a3b7-1d3516d61613", 00:28:19.888 "strip_size_kb": 0, 00:28:19.888 "state": "online", 00:28:19.888 "raid_level": "raid1", 00:28:19.888 "superblock": true, 00:28:19.888 "num_base_bdevs": 2, 00:28:19.888 "num_base_bdevs_discovered": 2, 00:28:19.888 "num_base_bdevs_operational": 2, 00:28:19.888 "base_bdevs_list": [ 00:28:19.888 { 00:28:19.888 "name": "pt1", 00:28:19.888 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:19.888 "is_configured": true, 00:28:19.888 "data_offset": 256, 00:28:19.888 "data_size": 7936 00:28:19.888 }, 00:28:19.888 { 00:28:19.888 "name": "pt2", 00:28:19.888 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:19.888 "is_configured": true, 00:28:19.888 "data_offset": 256, 00:28:19.888 "data_size": 7936 00:28:19.888 } 00:28:19.888 ] 00:28:19.888 } 00:28:19.888 } 00:28:19.888 }' 00:28:19.888 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:19.888 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:19.888 pt2' 00:28:19.888 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:19.888 22:35:30 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:19.888 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:20.144 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:20.144 "name": "pt1", 00:28:20.144 "aliases": [ 00:28:20.144 "00000000-0000-0000-0000-000000000001" 00:28:20.144 ], 00:28:20.144 "product_name": "passthru", 00:28:20.144 "block_size": 4096, 00:28:20.144 "num_blocks": 8192, 00:28:20.144 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:20.144 "md_size": 32, 00:28:20.144 "md_interleave": false, 00:28:20.144 "dif_type": 0, 00:28:20.144 "assigned_rate_limits": { 00:28:20.144 "rw_ios_per_sec": 0, 00:28:20.144 "rw_mbytes_per_sec": 0, 00:28:20.144 "r_mbytes_per_sec": 0, 00:28:20.144 "w_mbytes_per_sec": 0 00:28:20.144 }, 00:28:20.144 "claimed": true, 00:28:20.144 "claim_type": "exclusive_write", 00:28:20.144 "zoned": false, 00:28:20.144 "supported_io_types": { 00:28:20.144 "read": true, 00:28:20.144 "write": true, 00:28:20.144 "unmap": true, 00:28:20.144 "flush": true, 00:28:20.144 "reset": true, 00:28:20.144 "nvme_admin": false, 00:28:20.144 "nvme_io": false, 00:28:20.144 "nvme_io_md": false, 00:28:20.144 "write_zeroes": true, 00:28:20.144 "zcopy": true, 00:28:20.144 "get_zone_info": false, 00:28:20.144 "zone_management": false, 00:28:20.144 "zone_append": false, 00:28:20.144 "compare": false, 00:28:20.144 "compare_and_write": false, 00:28:20.144 "abort": true, 00:28:20.144 "seek_hole": false, 00:28:20.144 "seek_data": false, 00:28:20.144 "copy": true, 00:28:20.144 "nvme_iov_md": false 00:28:20.144 }, 00:28:20.144 "memory_domains": [ 00:28:20.144 { 00:28:20.144 "dma_device_id": "system", 00:28:20.144 "dma_device_type": 1 00:28:20.144 }, 00:28:20.144 { 00:28:20.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:20.144 "dma_device_type": 2 00:28:20.144 } 00:28:20.144 ], 00:28:20.144 "driver_specific": { 00:28:20.144 "passthru": { 00:28:20.144 "name": "pt1", 00:28:20.144 "base_bdev_name": "malloc1" 00:28:20.144 } 00:28:20.144 } 00:28:20.144 }' 00:28:20.144 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:20.144 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:20.144 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:20.144 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:20.144 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:20.416 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:20.416 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:20.416 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:20.416 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:20.416 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:20.416 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:20.416 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 
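At this point the trace has finished the md-separate property checks for pt1: it dumps the bdev with bdev_get_bdevs -b pt1, then asserts block_size, md_size, md_interleave and dif_type with one jq query each, and the loop at bdev_raid.sh@203 repeats the same checks for pt2 just below. A condensed, hand-runnable sketch of that check follows; the rpc.py path, socket and expected values are copied from the trace, and only the explicit loop and exit handling are illustrative additions.

  # Minimal sketch of the per-bdev property checks driven by bdev_raid.sh@203-208 above.
  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  SOCK=/var/tmp/spdk-raid.sock
  for name in pt1 pt2; do
      info=$($RPC -s $SOCK bdev_get_bdevs -b $name | jq '.[]')
      [[ $(echo "$info" | jq .block_size)    == 4096  ]] || exit 1   # 4096-byte data blocks
      [[ $(echo "$info" | jq .md_size)       == 32    ]] || exit 1   # 32-byte separate metadata area
      [[ $(echo "$info" | jq .md_interleave) == false ]] || exit 1   # metadata kept out of the data stream
      [[ $(echo "$info" | jq .dif_type)      == 0     ]] || exit 1   # no DIF protection configured
  done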
00:28:20.416 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:20.416 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:20.416 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:20.717 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:20.717 "name": "pt2", 00:28:20.717 "aliases": [ 00:28:20.717 "00000000-0000-0000-0000-000000000002" 00:28:20.717 ], 00:28:20.717 "product_name": "passthru", 00:28:20.717 "block_size": 4096, 00:28:20.717 "num_blocks": 8192, 00:28:20.717 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:20.717 "md_size": 32, 00:28:20.717 "md_interleave": false, 00:28:20.717 "dif_type": 0, 00:28:20.717 "assigned_rate_limits": { 00:28:20.717 "rw_ios_per_sec": 0, 00:28:20.717 "rw_mbytes_per_sec": 0, 00:28:20.717 "r_mbytes_per_sec": 0, 00:28:20.717 "w_mbytes_per_sec": 0 00:28:20.717 }, 00:28:20.717 "claimed": true, 00:28:20.717 "claim_type": "exclusive_write", 00:28:20.717 "zoned": false, 00:28:20.717 "supported_io_types": { 00:28:20.717 "read": true, 00:28:20.717 "write": true, 00:28:20.717 "unmap": true, 00:28:20.717 "flush": true, 00:28:20.717 "reset": true, 00:28:20.717 "nvme_admin": false, 00:28:20.717 "nvme_io": false, 00:28:20.717 "nvme_io_md": false, 00:28:20.717 "write_zeroes": true, 00:28:20.717 "zcopy": true, 00:28:20.717 "get_zone_info": false, 00:28:20.717 "zone_management": false, 00:28:20.717 "zone_append": false, 00:28:20.717 "compare": false, 00:28:20.717 "compare_and_write": false, 00:28:20.717 "abort": true, 00:28:20.717 "seek_hole": false, 00:28:20.717 "seek_data": false, 00:28:20.717 "copy": true, 00:28:20.717 "nvme_iov_md": false 00:28:20.717 }, 00:28:20.717 "memory_domains": [ 00:28:20.717 { 00:28:20.717 "dma_device_id": "system", 00:28:20.717 "dma_device_type": 1 00:28:20.717 }, 00:28:20.717 { 00:28:20.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:20.717 "dma_device_type": 2 00:28:20.717 } 00:28:20.717 ], 00:28:20.717 "driver_specific": { 00:28:20.717 "passthru": { 00:28:20.717 "name": "pt2", 00:28:20.717 "base_bdev_name": "malloc2" 00:28:20.717 } 00:28:20.717 } 00:28:20.717 }' 00:28:20.717 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:20.717 22:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:20.717 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:20.717 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:20.976 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:20.976 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:20.976 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:20.976 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:20.976 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:20.976 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:20.976 22:35:31 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:20.976 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:20.976 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:20.976 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:21.234 [2024-07-12 22:35:31.522454] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:21.234 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=71abfa54-7205-4c88-a3b7-1d3516d61613 00:28:21.234 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 71abfa54-7205-4c88-a3b7-1d3516d61613 ']' 00:28:21.234 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:21.493 [2024-07-12 22:35:31.770862] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:21.493 [2024-07-12 22:35:31.770880] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:21.493 [2024-07-12 22:35:31.770937] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:21.493 [2024-07-12 22:35:31.770990] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:21.493 [2024-07-12 22:35:31.771006] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c21d20 name raid_bdev1, state offline 00:28:21.493 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.493 22:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:21.753 22:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:21.753 22:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:21.753 22:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:21.753 22:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:22.012 22:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:22.012 22:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:22.271 22:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:22.272 22:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:22.531 22:35:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:22.797 [2024-07-12 22:35:33.014128] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:22.797 [2024-07-12 22:35:33.015510] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:22.797 [2024-07-12 22:35:33.015567] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:22.797 [2024-07-12 22:35:33.015613] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:22.797 [2024-07-12 22:35:33.015632] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:22.797 [2024-07-12 22:35:33.015643] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a91ed0 name raid_bdev1, state configuring 00:28:22.797 request: 00:28:22.797 { 00:28:22.797 "name": "raid_bdev1", 00:28:22.797 "raid_level": "raid1", 00:28:22.797 "base_bdevs": [ 00:28:22.797 "malloc1", 00:28:22.797 "malloc2" 00:28:22.797 ], 00:28:22.797 "superblock": false, 00:28:22.797 "method": "bdev_raid_create", 00:28:22.797 "req_id": 1 00:28:22.797 } 00:28:22.797 Got JSON-RPC error response 00:28:22.797 response: 00:28:22.797 { 00:28:22.797 "code": -17, 00:28:22.797 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:22.797 } 00:28:22.797 22:35:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:28:22.797 22:35:33 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:22.797 22:35:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:22.797 22:35:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:22.797 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.797 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:23.055 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:23.055 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:23.055 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:23.314 [2024-07-12 22:35:33.491328] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:23.314 [2024-07-12 22:35:33.491368] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:23.314 [2024-07-12 22:35:33.491385] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c2fee0 00:28:23.314 [2024-07-12 22:35:33.491398] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:23.314 [2024-07-12 22:35:33.492824] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:23.314 [2024-07-12 22:35:33.492849] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:23.314 [2024-07-12 22:35:33.492893] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:23.314 [2024-07-12 22:35:33.492917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:23.314 pt1 00:28:23.314 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:23.314 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:23.314 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:23.314 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:23.314 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:23.314 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:23.314 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.314 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.314 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.314 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.314 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:28:23.314 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:23.573 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.573 "name": "raid_bdev1", 00:28:23.573 "uuid": "71abfa54-7205-4c88-a3b7-1d3516d61613", 00:28:23.573 "strip_size_kb": 0, 00:28:23.573 "state": "configuring", 00:28:23.573 "raid_level": "raid1", 00:28:23.573 "superblock": true, 00:28:23.573 "num_base_bdevs": 2, 00:28:23.573 "num_base_bdevs_discovered": 1, 00:28:23.573 "num_base_bdevs_operational": 2, 00:28:23.573 "base_bdevs_list": [ 00:28:23.573 { 00:28:23.573 "name": "pt1", 00:28:23.573 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:23.573 "is_configured": true, 00:28:23.573 "data_offset": 256, 00:28:23.573 "data_size": 7936 00:28:23.573 }, 00:28:23.573 { 00:28:23.573 "name": null, 00:28:23.573 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:23.573 "is_configured": false, 00:28:23.573 "data_offset": 256, 00:28:23.573 "data_size": 7936 00:28:23.573 } 00:28:23.573 ] 00:28:23.573 }' 00:28:23.573 22:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.573 22:35:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:24.141 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:24.141 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:24.141 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:24.141 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:24.400 [2024-07-12 22:35:34.538113] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:24.400 [2024-07-12 22:35:34.538162] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:24.400 [2024-07-12 22:35:34.538179] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a92490 00:28:24.400 [2024-07-12 22:35:34.538192] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:24.400 [2024-07-12 22:35:34.538377] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:24.400 [2024-07-12 22:35:34.538393] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:24.401 [2024-07-12 22:35:34.538433] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:24.401 [2024-07-12 22:35:34.538450] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:24.401 [2024-07-12 22:35:34.538539] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c165d0 00:28:24.401 [2024-07-12 22:35:34.538549] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:24.401 [2024-07-12 22:35:34.538604] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c17800 00:28:24.401 [2024-07-12 22:35:34.538704] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c165d0 00:28:24.401 [2024-07-12 22:35:34.538713] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c165d0 
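The stretch of the trace ending here exercises superblock-driven re-assembly: raid_bdev1 and both passthru bdevs are deleted, a fresh bdev_raid_create over the same malloc bdevs is expected to fail with 'File exists' because their on-disk superblocks survive, and re-registering pt1 and then pt2 lets the raid module rebuild raid_bdev1 on its own, moving it from configuring back to online as the surrounding state dumps confirm. A hand-driven sketch of that RPC sequence follows; every command and argument is taken from the trace, and only the inline comments and condensed state checks are added for illustration.

  # Sketch of the superblock re-assembly sequence exercised by bdev_raid.sh@440-482 above.
  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $RPC bdev_raid_delete raid_bdev1                  # drop the raid bdev; base bdevs keep their superblocks
  $RPC bdev_passthru_delete pt1
  $RPC bdev_passthru_delete pt2
  $RPC bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 \
      && echo "unexpected success" || true          # expected failure: -17 'File exists'
  $RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'   # configuring
  $RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'   # online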
00:28:24.401 [2024-07-12 22:35:34.538780] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:24.401 pt2 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.401 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:24.660 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:24.660 "name": "raid_bdev1", 00:28:24.660 "uuid": "71abfa54-7205-4c88-a3b7-1d3516d61613", 00:28:24.660 "strip_size_kb": 0, 00:28:24.660 "state": "online", 00:28:24.660 "raid_level": "raid1", 00:28:24.660 "superblock": true, 00:28:24.660 "num_base_bdevs": 2, 00:28:24.660 "num_base_bdevs_discovered": 2, 00:28:24.660 "num_base_bdevs_operational": 2, 00:28:24.660 "base_bdevs_list": [ 00:28:24.660 { 00:28:24.660 "name": "pt1", 00:28:24.660 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:24.660 "is_configured": true, 00:28:24.660 "data_offset": 256, 00:28:24.660 "data_size": 7936 00:28:24.660 }, 00:28:24.660 { 00:28:24.660 "name": "pt2", 00:28:24.660 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:24.660 "is_configured": true, 00:28:24.660 "data_offset": 256, 00:28:24.660 "data_size": 7936 00:28:24.660 } 00:28:24.660 ] 00:28:24.660 }' 00:28:24.660 22:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:24.661 22:35:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:25.229 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:25.229 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:25.229 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:25.229 22:35:35 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:25.229 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:25.229 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:25.229 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:25.229 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:25.488 [2024-07-12 22:35:35.629282] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:25.488 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:25.488 "name": "raid_bdev1", 00:28:25.488 "aliases": [ 00:28:25.488 "71abfa54-7205-4c88-a3b7-1d3516d61613" 00:28:25.488 ], 00:28:25.488 "product_name": "Raid Volume", 00:28:25.488 "block_size": 4096, 00:28:25.488 "num_blocks": 7936, 00:28:25.488 "uuid": "71abfa54-7205-4c88-a3b7-1d3516d61613", 00:28:25.488 "md_size": 32, 00:28:25.488 "md_interleave": false, 00:28:25.488 "dif_type": 0, 00:28:25.488 "assigned_rate_limits": { 00:28:25.488 "rw_ios_per_sec": 0, 00:28:25.488 "rw_mbytes_per_sec": 0, 00:28:25.488 "r_mbytes_per_sec": 0, 00:28:25.488 "w_mbytes_per_sec": 0 00:28:25.488 }, 00:28:25.488 "claimed": false, 00:28:25.488 "zoned": false, 00:28:25.488 "supported_io_types": { 00:28:25.488 "read": true, 00:28:25.488 "write": true, 00:28:25.488 "unmap": false, 00:28:25.488 "flush": false, 00:28:25.488 "reset": true, 00:28:25.488 "nvme_admin": false, 00:28:25.488 "nvme_io": false, 00:28:25.488 "nvme_io_md": false, 00:28:25.488 "write_zeroes": true, 00:28:25.488 "zcopy": false, 00:28:25.488 "get_zone_info": false, 00:28:25.488 "zone_management": false, 00:28:25.488 "zone_append": false, 00:28:25.488 "compare": false, 00:28:25.488 "compare_and_write": false, 00:28:25.488 "abort": false, 00:28:25.488 "seek_hole": false, 00:28:25.488 "seek_data": false, 00:28:25.488 "copy": false, 00:28:25.488 "nvme_iov_md": false 00:28:25.488 }, 00:28:25.488 "memory_domains": [ 00:28:25.488 { 00:28:25.488 "dma_device_id": "system", 00:28:25.488 "dma_device_type": 1 00:28:25.488 }, 00:28:25.488 { 00:28:25.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:25.488 "dma_device_type": 2 00:28:25.488 }, 00:28:25.488 { 00:28:25.488 "dma_device_id": "system", 00:28:25.488 "dma_device_type": 1 00:28:25.488 }, 00:28:25.488 { 00:28:25.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:25.488 "dma_device_type": 2 00:28:25.488 } 00:28:25.488 ], 00:28:25.488 "driver_specific": { 00:28:25.488 "raid": { 00:28:25.488 "uuid": "71abfa54-7205-4c88-a3b7-1d3516d61613", 00:28:25.488 "strip_size_kb": 0, 00:28:25.488 "state": "online", 00:28:25.488 "raid_level": "raid1", 00:28:25.488 "superblock": true, 00:28:25.488 "num_base_bdevs": 2, 00:28:25.488 "num_base_bdevs_discovered": 2, 00:28:25.488 "num_base_bdevs_operational": 2, 00:28:25.488 "base_bdevs_list": [ 00:28:25.488 { 00:28:25.488 "name": "pt1", 00:28:25.488 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:25.488 "is_configured": true, 00:28:25.488 "data_offset": 256, 00:28:25.488 "data_size": 7936 00:28:25.488 }, 00:28:25.488 { 00:28:25.488 "name": "pt2", 00:28:25.488 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:25.488 "is_configured": true, 00:28:25.488 "data_offset": 256, 00:28:25.488 "data_size": 7936 
00:28:25.488 } 00:28:25.488 ] 00:28:25.488 } 00:28:25.488 } 00:28:25.488 }' 00:28:25.489 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:25.489 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:25.489 pt2' 00:28:25.489 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:25.489 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:25.489 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:25.748 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:25.748 "name": "pt1", 00:28:25.748 "aliases": [ 00:28:25.748 "00000000-0000-0000-0000-000000000001" 00:28:25.748 ], 00:28:25.748 "product_name": "passthru", 00:28:25.748 "block_size": 4096, 00:28:25.748 "num_blocks": 8192, 00:28:25.748 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:25.748 "md_size": 32, 00:28:25.748 "md_interleave": false, 00:28:25.748 "dif_type": 0, 00:28:25.748 "assigned_rate_limits": { 00:28:25.748 "rw_ios_per_sec": 0, 00:28:25.748 "rw_mbytes_per_sec": 0, 00:28:25.748 "r_mbytes_per_sec": 0, 00:28:25.748 "w_mbytes_per_sec": 0 00:28:25.748 }, 00:28:25.748 "claimed": true, 00:28:25.748 "claim_type": "exclusive_write", 00:28:25.748 "zoned": false, 00:28:25.748 "supported_io_types": { 00:28:25.748 "read": true, 00:28:25.748 "write": true, 00:28:25.748 "unmap": true, 00:28:25.748 "flush": true, 00:28:25.748 "reset": true, 00:28:25.748 "nvme_admin": false, 00:28:25.748 "nvme_io": false, 00:28:25.748 "nvme_io_md": false, 00:28:25.748 "write_zeroes": true, 00:28:25.748 "zcopy": true, 00:28:25.748 "get_zone_info": false, 00:28:25.748 "zone_management": false, 00:28:25.748 "zone_append": false, 00:28:25.748 "compare": false, 00:28:25.748 "compare_and_write": false, 00:28:25.748 "abort": true, 00:28:25.748 "seek_hole": false, 00:28:25.748 "seek_data": false, 00:28:25.748 "copy": true, 00:28:25.748 "nvme_iov_md": false 00:28:25.748 }, 00:28:25.748 "memory_domains": [ 00:28:25.748 { 00:28:25.748 "dma_device_id": "system", 00:28:25.748 "dma_device_type": 1 00:28:25.748 }, 00:28:25.748 { 00:28:25.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:25.748 "dma_device_type": 2 00:28:25.748 } 00:28:25.748 ], 00:28:25.748 "driver_specific": { 00:28:25.748 "passthru": { 00:28:25.748 "name": "pt1", 00:28:25.748 "base_bdev_name": "malloc1" 00:28:25.748 } 00:28:25.748 } 00:28:25.748 }' 00:28:25.748 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:25.748 22:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:25.748 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:25.748 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:25.748 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:26.007 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:26.007 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:26.007 
22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:26.007 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:26.007 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:26.007 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:26.007 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:26.007 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:26.007 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:26.007 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:26.267 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:26.267 "name": "pt2", 00:28:26.267 "aliases": [ 00:28:26.267 "00000000-0000-0000-0000-000000000002" 00:28:26.267 ], 00:28:26.267 "product_name": "passthru", 00:28:26.267 "block_size": 4096, 00:28:26.267 "num_blocks": 8192, 00:28:26.267 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:26.267 "md_size": 32, 00:28:26.267 "md_interleave": false, 00:28:26.267 "dif_type": 0, 00:28:26.267 "assigned_rate_limits": { 00:28:26.267 "rw_ios_per_sec": 0, 00:28:26.267 "rw_mbytes_per_sec": 0, 00:28:26.267 "r_mbytes_per_sec": 0, 00:28:26.267 "w_mbytes_per_sec": 0 00:28:26.267 }, 00:28:26.267 "claimed": true, 00:28:26.267 "claim_type": "exclusive_write", 00:28:26.267 "zoned": false, 00:28:26.267 "supported_io_types": { 00:28:26.267 "read": true, 00:28:26.267 "write": true, 00:28:26.267 "unmap": true, 00:28:26.267 "flush": true, 00:28:26.267 "reset": true, 00:28:26.267 "nvme_admin": false, 00:28:26.267 "nvme_io": false, 00:28:26.267 "nvme_io_md": false, 00:28:26.267 "write_zeroes": true, 00:28:26.267 "zcopy": true, 00:28:26.267 "get_zone_info": false, 00:28:26.267 "zone_management": false, 00:28:26.267 "zone_append": false, 00:28:26.267 "compare": false, 00:28:26.267 "compare_and_write": false, 00:28:26.267 "abort": true, 00:28:26.267 "seek_hole": false, 00:28:26.267 "seek_data": false, 00:28:26.267 "copy": true, 00:28:26.267 "nvme_iov_md": false 00:28:26.267 }, 00:28:26.267 "memory_domains": [ 00:28:26.267 { 00:28:26.267 "dma_device_id": "system", 00:28:26.267 "dma_device_type": 1 00:28:26.267 }, 00:28:26.267 { 00:28:26.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:26.267 "dma_device_type": 2 00:28:26.267 } 00:28:26.267 ], 00:28:26.267 "driver_specific": { 00:28:26.267 "passthru": { 00:28:26.267 "name": "pt2", 00:28:26.267 "base_bdev_name": "malloc2" 00:28:26.267 } 00:28:26.267 } 00:28:26.267 }' 00:28:26.267 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:26.526 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:26.526 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:26.526 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:26.526 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:26.526 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- 
# [[ 32 == 32 ]] 00:28:26.526 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:26.526 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:26.526 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:26.526 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:26.526 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:26.526 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:26.526 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:26.785 22:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:26.785 [2024-07-12 22:35:37.077112] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:26.785 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 71abfa54-7205-4c88-a3b7-1d3516d61613 '!=' 71abfa54-7205-4c88-a3b7-1d3516d61613 ']' 00:28:26.785 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:26.785 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:26.785 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:28:26.785 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:27.044 [2024-07-12 22:35:37.321517] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:27.044 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:27.044 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:27.044 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:27.044 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:27.044 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:27.044 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:27.044 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:27.044 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:27.044 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:27.044 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:27.044 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.044 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:27.303 22:35:37 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:27.303 "name": "raid_bdev1", 00:28:27.303 "uuid": "71abfa54-7205-4c88-a3b7-1d3516d61613", 00:28:27.303 "strip_size_kb": 0, 00:28:27.303 "state": "online", 00:28:27.303 "raid_level": "raid1", 00:28:27.303 "superblock": true, 00:28:27.303 "num_base_bdevs": 2, 00:28:27.303 "num_base_bdevs_discovered": 1, 00:28:27.303 "num_base_bdevs_operational": 1, 00:28:27.303 "base_bdevs_list": [ 00:28:27.303 { 00:28:27.303 "name": null, 00:28:27.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:27.303 "is_configured": false, 00:28:27.303 "data_offset": 256, 00:28:27.303 "data_size": 7936 00:28:27.303 }, 00:28:27.303 { 00:28:27.303 "name": "pt2", 00:28:27.303 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:27.303 "is_configured": true, 00:28:27.303 "data_offset": 256, 00:28:27.303 "data_size": 7936 00:28:27.303 } 00:28:27.303 ] 00:28:27.303 }' 00:28:27.303 22:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:27.303 22:35:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:27.872 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:28.132 [2024-07-12 22:35:38.356246] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:28.132 [2024-07-12 22:35:38.356272] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:28.132 [2024-07-12 22:35:38.356329] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:28.132 [2024-07-12 22:35:38.356375] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:28.132 [2024-07-12 22:35:38.356388] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c165d0 name raid_bdev1, state offline 00:28:28.132 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:28.132 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:28.391 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:28:28.391 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:28.391 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:28.391 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:28.391 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:28.650 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:28.650 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:28.650 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:28.650 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:28.650 22:35:38 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@518 -- # i=1 00:28:28.650 22:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:28.909 [2024-07-12 22:35:39.098166] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:28.909 [2024-07-12 22:35:39.098211] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:28.909 [2024-07-12 22:35:39.098232] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c14660 00:28:28.909 [2024-07-12 22:35:39.098245] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:28.909 [2024-07-12 22:35:39.099675] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:28.909 [2024-07-12 22:35:39.099701] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:28.909 [2024-07-12 22:35:39.099747] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:28.909 [2024-07-12 22:35:39.099771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:28.909 [2024-07-12 22:35:39.099846] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c16d10 00:28:28.909 [2024-07-12 22:35:39.099856] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:28.909 [2024-07-12 22:35:39.099908] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c17560 00:28:28.909 [2024-07-12 22:35:39.100018] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c16d10 00:28:28.909 [2024-07-12 22:35:39.100029] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c16d10 00:28:28.909 [2024-07-12 22:35:39.100095] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:28.909 pt2 00:28:28.909 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:28.909 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:28.909 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:28.909 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:28.909 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:28.909 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:28.909 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:28.909 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:28.910 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:28.910 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:28.910 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:28.910 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:29.169 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:29.169 "name": "raid_bdev1", 00:28:29.169 "uuid": "71abfa54-7205-4c88-a3b7-1d3516d61613", 00:28:29.169 "strip_size_kb": 0, 00:28:29.169 "state": "online", 00:28:29.169 "raid_level": "raid1", 00:28:29.169 "superblock": true, 00:28:29.169 "num_base_bdevs": 2, 00:28:29.169 "num_base_bdevs_discovered": 1, 00:28:29.169 "num_base_bdevs_operational": 1, 00:28:29.169 "base_bdevs_list": [ 00:28:29.169 { 00:28:29.169 "name": null, 00:28:29.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:29.169 "is_configured": false, 00:28:29.169 "data_offset": 256, 00:28:29.169 "data_size": 7936 00:28:29.169 }, 00:28:29.169 { 00:28:29.169 "name": "pt2", 00:28:29.169 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:29.169 "is_configured": true, 00:28:29.169 "data_offset": 256, 00:28:29.169 "data_size": 7936 00:28:29.169 } 00:28:29.169 ] 00:28:29.169 }' 00:28:29.169 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:29.169 22:35:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:29.739 22:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:29.998 [2024-07-12 22:35:40.181033] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:29.998 [2024-07-12 22:35:40.181063] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:29.998 [2024-07-12 22:35:40.181118] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:29.998 [2024-07-12 22:35:40.181164] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:29.998 [2024-07-12 22:35:40.181177] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c16d10 name raid_bdev1, state offline 00:28:29.998 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.998 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:30.257 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:30.257 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:30.257 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:28:30.257 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:30.516 [2024-07-12 22:35:40.678319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:30.516 [2024-07-12 22:35:40.678365] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:30.516 [2024-07-12 22:35:40.678383] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c15760 00:28:30.516 [2024-07-12 22:35:40.678396] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:30.516 [2024-07-12 22:35:40.679817] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:30.516 [2024-07-12 22:35:40.679843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:30.516 [2024-07-12 22:35:40.679889] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:30.516 [2024-07-12 22:35:40.679912] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:30.516 [2024-07-12 22:35:40.680015] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:30.516 [2024-07-12 22:35:40.680029] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:30.516 [2024-07-12 22:35:40.680043] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c17850 name raid_bdev1, state configuring 00:28:30.516 [2024-07-12 22:35:40.680066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:30.516 [2024-07-12 22:35:40.680117] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c16850 00:28:30.516 [2024-07-12 22:35:40.680127] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:30.516 [2024-07-12 22:35:40.680181] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c173b0 00:28:30.516 [2024-07-12 22:35:40.680278] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c16850 00:28:30.516 [2024-07-12 22:35:40.680287] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c16850 00:28:30.516 [2024-07-12 22:35:40.680359] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:30.516 pt1 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.516 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:30.776 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:28:30.776 "name": "raid_bdev1", 00:28:30.776 "uuid": "71abfa54-7205-4c88-a3b7-1d3516d61613", 00:28:30.776 "strip_size_kb": 0, 00:28:30.776 "state": "online", 00:28:30.776 "raid_level": "raid1", 00:28:30.776 "superblock": true, 00:28:30.776 "num_base_bdevs": 2, 00:28:30.776 "num_base_bdevs_discovered": 1, 00:28:30.776 "num_base_bdevs_operational": 1, 00:28:30.776 "base_bdevs_list": [ 00:28:30.776 { 00:28:30.776 "name": null, 00:28:30.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:30.776 "is_configured": false, 00:28:30.776 "data_offset": 256, 00:28:30.776 "data_size": 7936 00:28:30.776 }, 00:28:30.776 { 00:28:30.776 "name": "pt2", 00:28:30.776 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:30.776 "is_configured": true, 00:28:30.776 "data_offset": 256, 00:28:30.776 "data_size": 7936 00:28:30.776 } 00:28:30.776 ] 00:28:30.776 }' 00:28:30.776 22:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:30.776 22:35:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:31.344 22:35:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:31.344 22:35:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:31.604 22:35:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:31.604 22:35:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:31.604 22:35:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:31.863 [2024-07-12 22:35:42.002073] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:31.863 22:35:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 71abfa54-7205-4c88-a3b7-1d3516d61613 '!=' 71abfa54-7205-4c88-a3b7-1d3516d61613 ']' 00:28:31.863 22:35:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 3569329 00:28:31.863 22:35:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 3569329 ']' 00:28:31.863 22:35:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 3569329 00:28:31.863 22:35:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:31.863 22:35:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:31.863 22:35:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3569329 00:28:31.863 22:35:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:31.863 22:35:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:31.863 22:35:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3569329' 00:28:31.863 killing process with pid 3569329 00:28:31.863 22:35:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 3569329 00:28:31.863 [2024-07-12 22:35:42.075021] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:31.863 [2024-07-12 22:35:42.075074] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:31.863 [2024-07-12 22:35:42.075120] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:31.863 [2024-07-12 22:35:42.075131] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c16850 name raid_bdev1, state offline 00:28:31.863 22:35:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 3569329 00:28:31.863 [2024-07-12 22:35:42.099055] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:32.123 22:35:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:28:32.123 00:28:32.123 real 0m15.560s 00:28:32.123 user 0m28.202s 00:28:32.123 sys 0m2.873s 00:28:32.123 22:35:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:32.123 22:35:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:32.123 ************************************ 00:28:32.123 END TEST raid_superblock_test_md_separate 00:28:32.123 ************************************ 00:28:32.123 22:35:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:32.123 22:35:42 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:28:32.123 22:35:42 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:28:32.123 22:35:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:32.123 22:35:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:32.123 22:35:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:32.123 ************************************ 00:28:32.123 START TEST raid_rebuild_test_sb_md_separate 00:28:32.123 ************************************ 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:32.123 22:35:42 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=3571748 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 3571748 /var/tmp/spdk-raid.sock 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 3571748 ']' 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:32.123 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:32.123 22:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:32.383 [2024-07-12 22:35:42.486846] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:28:32.383 [2024-07-12 22:35:42.486917] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3571748 ] 00:28:32.383 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:32.383 Zero copy mechanism will not be used. 
00:28:32.383 [2024-07-12 22:35:42.618264] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:32.642 [2024-07-12 22:35:42.722535] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:32.642 [2024-07-12 22:35:42.786976] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:32.642 [2024-07-12 22:35:42.787017] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:33.209 22:35:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:33.209 22:35:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:33.209 22:35:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:33.209 22:35:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:28:33.468 BaseBdev1_malloc 00:28:33.468 22:35:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:33.727 [2024-07-12 22:35:43.901519] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:33.727 [2024-07-12 22:35:43.901569] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:33.727 [2024-07-12 22:35:43.901594] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12a16d0 00:28:33.727 [2024-07-12 22:35:43.901607] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:33.727 [2024-07-12 22:35:43.903065] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:33.727 [2024-07-12 22:35:43.903092] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:33.727 BaseBdev1 00:28:33.727 22:35:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:33.727 22:35:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:28:33.986 BaseBdev2_malloc 00:28:33.986 22:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:34.245 [2024-07-12 22:35:44.392237] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:34.245 [2024-07-12 22:35:44.392282] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:34.245 [2024-07-12 22:35:44.392305] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13f91f0 00:28:34.245 [2024-07-12 22:35:44.392319] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:34.245 [2024-07-12 22:35:44.393626] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:34.245 [2024-07-12 22:35:44.393653] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:34.245 BaseBdev2 00:28:34.245 22:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:28:34.504 spare_malloc 00:28:34.504 22:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:34.763 spare_delay 00:28:34.763 22:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:35.022 [2024-07-12 22:35:45.123587] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:35.022 [2024-07-12 22:35:45.123632] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:35.022 [2024-07-12 22:35:45.123655] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13f57a0 00:28:35.022 [2024-07-12 22:35:45.123668] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:35.022 [2024-07-12 22:35:45.124986] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:35.022 [2024-07-12 22:35:45.125013] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:35.022 spare 00:28:35.022 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:35.281 [2024-07-12 22:35:45.360230] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:35.281 [2024-07-12 22:35:45.361387] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:35.281 [2024-07-12 22:35:45.361548] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13f61c0 00:28:35.281 [2024-07-12 22:35:45.361561] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:35.281 [2024-07-12 22:35:45.361628] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1307360 00:28:35.281 [2024-07-12 22:35:45.361738] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13f61c0 00:28:35.281 [2024-07-12 22:35:45.361749] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13f61c0 00:28:35.281 [2024-07-12 22:35:45.361813] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:35.281 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:35.281 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:35.281 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:35.281 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:35.281 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:35.281 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:35.281 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:28:35.281 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:35.281 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:35.281 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:35.281 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.281 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:35.539 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:35.539 "name": "raid_bdev1", 00:28:35.539 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:35.539 "strip_size_kb": 0, 00:28:35.539 "state": "online", 00:28:35.539 "raid_level": "raid1", 00:28:35.539 "superblock": true, 00:28:35.539 "num_base_bdevs": 2, 00:28:35.539 "num_base_bdevs_discovered": 2, 00:28:35.539 "num_base_bdevs_operational": 2, 00:28:35.539 "base_bdevs_list": [ 00:28:35.539 { 00:28:35.539 "name": "BaseBdev1", 00:28:35.539 "uuid": "e38bd4bc-5d7f-5a00-b3d1-422005ba6113", 00:28:35.539 "is_configured": true, 00:28:35.539 "data_offset": 256, 00:28:35.539 "data_size": 7936 00:28:35.539 }, 00:28:35.539 { 00:28:35.539 "name": "BaseBdev2", 00:28:35.539 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:35.539 "is_configured": true, 00:28:35.539 "data_offset": 256, 00:28:35.539 "data_size": 7936 00:28:35.539 } 00:28:35.539 ] 00:28:35.539 }' 00:28:35.539 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:35.539 22:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:36.108 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:36.108 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:36.108 [2024-07-12 22:35:46.431307] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:36.368 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:28:36.627 [2024-07-12 22:35:46.792134] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1307360 00:28:36.627 /dev/nbd0 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:36.627 1+0 records in 00:28:36.627 1+0 records out 00:28:36.627 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263856 s, 15.5 MB/s 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:36.627 22:35:46 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:28:36.627 22:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:28:37.563 7936+0 records in 00:28:37.563 7936+0 records out 00:28:37.563 32505856 bytes (33 MB, 31 MiB) copied, 0.750738 s, 43.3 MB/s 00:28:37.563 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:37.563 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:37.563 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:37.563 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:37.563 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:28:37.563 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:37.563 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:37.563 [2024-07-12 22:35:47.881082] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:37.563 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:37.821 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:37.821 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:37.821 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:37.821 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:37.821 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:37.821 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:37.821 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:37.821 22:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:37.821 [2024-07-12 22:35:48.121758] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:37.821 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:37.821 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:37.821 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:37.821 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:37.821 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:37.821 22:35:48 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:37.821 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:37.821 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:37.821 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:37.821 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:38.079 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.079 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.079 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:38.079 "name": "raid_bdev1", 00:28:38.079 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:38.079 "strip_size_kb": 0, 00:28:38.079 "state": "online", 00:28:38.079 "raid_level": "raid1", 00:28:38.079 "superblock": true, 00:28:38.079 "num_base_bdevs": 2, 00:28:38.079 "num_base_bdevs_discovered": 1, 00:28:38.079 "num_base_bdevs_operational": 1, 00:28:38.079 "base_bdevs_list": [ 00:28:38.079 { 00:28:38.079 "name": null, 00:28:38.079 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:38.079 "is_configured": false, 00:28:38.079 "data_offset": 256, 00:28:38.079 "data_size": 7936 00:28:38.079 }, 00:28:38.079 { 00:28:38.079 "name": "BaseBdev2", 00:28:38.079 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:38.079 "is_configured": true, 00:28:38.079 "data_offset": 256, 00:28:38.079 "data_size": 7936 00:28:38.079 } 00:28:38.079 ] 00:28:38.079 }' 00:28:38.079 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:38.079 22:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:39.013 22:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:39.013 [2024-07-12 22:35:49.232709] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:39.013 [2024-07-12 22:35:49.235035] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12a0350 00:28:39.013 [2024-07-12 22:35:49.237322] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:39.013 22:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:28:40.041 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:40.041 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:40.041 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:40.041 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:40.041 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:40.041 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.041 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:40.299 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:40.299 "name": "raid_bdev1", 00:28:40.299 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:40.299 "strip_size_kb": 0, 00:28:40.299 "state": "online", 00:28:40.299 "raid_level": "raid1", 00:28:40.299 "superblock": true, 00:28:40.299 "num_base_bdevs": 2, 00:28:40.299 "num_base_bdevs_discovered": 2, 00:28:40.299 "num_base_bdevs_operational": 2, 00:28:40.299 "process": { 00:28:40.299 "type": "rebuild", 00:28:40.299 "target": "spare", 00:28:40.299 "progress": { 00:28:40.299 "blocks": 3072, 00:28:40.299 "percent": 38 00:28:40.299 } 00:28:40.299 }, 00:28:40.299 "base_bdevs_list": [ 00:28:40.299 { 00:28:40.299 "name": "spare", 00:28:40.299 "uuid": "f9b07975-ed9e-53d9-91f4-c8a3941698eb", 00:28:40.299 "is_configured": true, 00:28:40.299 "data_offset": 256, 00:28:40.299 "data_size": 7936 00:28:40.299 }, 00:28:40.299 { 00:28:40.299 "name": "BaseBdev2", 00:28:40.299 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:40.299 "is_configured": true, 00:28:40.299 "data_offset": 256, 00:28:40.299 "data_size": 7936 00:28:40.299 } 00:28:40.299 ] 00:28:40.299 }' 00:28:40.299 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:40.299 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:40.299 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:40.557 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:40.557 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:40.557 [2024-07-12 22:35:50.850548] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:40.816 [2024-07-12 22:35:50.950930] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:40.816 [2024-07-12 22:35:50.950978] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:40.816 [2024-07-12 22:35:50.950995] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:40.816 [2024-07-12 22:35:50.951004] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:40.816 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:40.816 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:40.816 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:40.816 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:40.816 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:40.816 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 
00:28:40.816 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:40.816 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:40.816 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:40.816 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:40.816 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.816 22:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:41.075 22:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:41.075 "name": "raid_bdev1", 00:28:41.075 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:41.075 "strip_size_kb": 0, 00:28:41.075 "state": "online", 00:28:41.075 "raid_level": "raid1", 00:28:41.075 "superblock": true, 00:28:41.075 "num_base_bdevs": 2, 00:28:41.075 "num_base_bdevs_discovered": 1, 00:28:41.075 "num_base_bdevs_operational": 1, 00:28:41.075 "base_bdevs_list": [ 00:28:41.075 { 00:28:41.075 "name": null, 00:28:41.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:41.075 "is_configured": false, 00:28:41.075 "data_offset": 256, 00:28:41.075 "data_size": 7936 00:28:41.075 }, 00:28:41.075 { 00:28:41.075 "name": "BaseBdev2", 00:28:41.075 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:41.075 "is_configured": true, 00:28:41.075 "data_offset": 256, 00:28:41.075 "data_size": 7936 00:28:41.075 } 00:28:41.075 ] 00:28:41.075 }' 00:28:41.075 22:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:41.075 22:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:41.643 22:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:41.643 22:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:41.643 22:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:41.643 22:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:41.643 22:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:41.643 22:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.643 22:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:41.901 22:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:41.901 "name": "raid_bdev1", 00:28:41.901 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:41.901 "strip_size_kb": 0, 00:28:41.901 "state": "online", 00:28:41.901 "raid_level": "raid1", 00:28:41.901 "superblock": true, 00:28:41.901 "num_base_bdevs": 2, 00:28:41.901 "num_base_bdevs_discovered": 1, 00:28:41.901 "num_base_bdevs_operational": 1, 00:28:41.901 "base_bdevs_list": [ 00:28:41.901 { 00:28:41.901 "name": null, 00:28:41.901 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:28:41.901 "is_configured": false, 00:28:41.901 "data_offset": 256, 00:28:41.901 "data_size": 7936 00:28:41.901 }, 00:28:41.901 { 00:28:41.901 "name": "BaseBdev2", 00:28:41.901 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:41.901 "is_configured": true, 00:28:41.901 "data_offset": 256, 00:28:41.901 "data_size": 7936 00:28:41.901 } 00:28:41.901 ] 00:28:41.901 }' 00:28:41.901 22:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:41.901 22:35:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:41.901 22:35:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:41.901 22:35:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:41.901 22:35:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:42.160 [2024-07-12 22:35:52.286857] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:42.160 [2024-07-12 22:35:52.289480] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12a1280 00:28:42.160 [2024-07-12 22:35:52.291077] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:42.160 22:35:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:43.094 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:43.094 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:43.094 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:43.094 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:43.094 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:43.094 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.094 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.352 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:43.352 "name": "raid_bdev1", 00:28:43.352 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:43.352 "strip_size_kb": 0, 00:28:43.352 "state": "online", 00:28:43.352 "raid_level": "raid1", 00:28:43.352 "superblock": true, 00:28:43.352 "num_base_bdevs": 2, 00:28:43.352 "num_base_bdevs_discovered": 2, 00:28:43.352 "num_base_bdevs_operational": 2, 00:28:43.352 "process": { 00:28:43.352 "type": "rebuild", 00:28:43.352 "target": "spare", 00:28:43.352 "progress": { 00:28:43.352 "blocks": 3072, 00:28:43.352 "percent": 38 00:28:43.352 } 00:28:43.352 }, 00:28:43.352 "base_bdevs_list": [ 00:28:43.352 { 00:28:43.352 "name": "spare", 00:28:43.352 "uuid": "f9b07975-ed9e-53d9-91f4-c8a3941698eb", 00:28:43.352 "is_configured": true, 00:28:43.352 "data_offset": 256, 00:28:43.352 "data_size": 7936 00:28:43.352 }, 00:28:43.352 { 00:28:43.352 "name": 
"BaseBdev2", 00:28:43.352 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:43.352 "is_configured": true, 00:28:43.352 "data_offset": 256, 00:28:43.352 "data_size": 7936 00:28:43.352 } 00:28:43.352 ] 00:28:43.352 }' 00:28:43.352 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:43.352 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:43.352 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:28:43.612 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1057 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:43.612 "name": "raid_bdev1", 00:28:43.612 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:43.612 "strip_size_kb": 0, 00:28:43.612 "state": "online", 00:28:43.612 "raid_level": "raid1", 00:28:43.612 "superblock": true, 00:28:43.612 "num_base_bdevs": 2, 00:28:43.612 "num_base_bdevs_discovered": 2, 00:28:43.612 "num_base_bdevs_operational": 2, 00:28:43.612 "process": { 00:28:43.612 "type": "rebuild", 00:28:43.612 "target": "spare", 00:28:43.612 "progress": { 00:28:43.612 "blocks": 4096, 00:28:43.612 "percent": 51 00:28:43.612 } 00:28:43.612 }, 00:28:43.612 "base_bdevs_list": [ 00:28:43.612 { 00:28:43.612 "name": "spare", 00:28:43.612 "uuid": "f9b07975-ed9e-53d9-91f4-c8a3941698eb", 00:28:43.612 "is_configured": true, 00:28:43.612 "data_offset": 256, 00:28:43.612 "data_size": 7936 
00:28:43.612 }, 00:28:43.612 { 00:28:43.612 "name": "BaseBdev2", 00:28:43.612 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:43.612 "is_configured": true, 00:28:43.612 "data_offset": 256, 00:28:43.612 "data_size": 7936 00:28:43.612 } 00:28:43.612 ] 00:28:43.612 }' 00:28:43.612 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:43.871 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:43.871 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:43.871 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:43.871 22:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:44.808 22:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:44.808 22:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:44.809 22:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:44.809 22:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:44.809 22:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:44.809 22:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:44.809 22:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:44.809 22:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:45.069 22:35:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:45.069 "name": "raid_bdev1", 00:28:45.069 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:45.069 "strip_size_kb": 0, 00:28:45.069 "state": "online", 00:28:45.069 "raid_level": "raid1", 00:28:45.069 "superblock": true, 00:28:45.069 "num_base_bdevs": 2, 00:28:45.069 "num_base_bdevs_discovered": 2, 00:28:45.069 "num_base_bdevs_operational": 2, 00:28:45.069 "process": { 00:28:45.069 "type": "rebuild", 00:28:45.069 "target": "spare", 00:28:45.069 "progress": { 00:28:45.069 "blocks": 7168, 00:28:45.069 "percent": 90 00:28:45.069 } 00:28:45.069 }, 00:28:45.069 "base_bdevs_list": [ 00:28:45.069 { 00:28:45.069 "name": "spare", 00:28:45.069 "uuid": "f9b07975-ed9e-53d9-91f4-c8a3941698eb", 00:28:45.069 "is_configured": true, 00:28:45.069 "data_offset": 256, 00:28:45.069 "data_size": 7936 00:28:45.069 }, 00:28:45.069 { 00:28:45.069 "name": "BaseBdev2", 00:28:45.069 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:45.069 "is_configured": true, 00:28:45.069 "data_offset": 256, 00:28:45.069 "data_size": 7936 00:28:45.069 } 00:28:45.069 ] 00:28:45.069 }' 00:28:45.069 22:35:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:45.069 22:35:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:45.069 22:35:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:45.069 
22:35:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:45.069 22:35:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:45.328 [2024-07-12 22:35:55.415491] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:45.328 [2024-07-12 22:35:55.415551] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:45.328 [2024-07-12 22:35:55.415635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:46.265 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:46.265 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:46.265 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:46.265 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:46.265 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:46.265 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:46.265 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.265 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:46.265 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:46.265 "name": "raid_bdev1", 00:28:46.265 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:46.265 "strip_size_kb": 0, 00:28:46.265 "state": "online", 00:28:46.265 "raid_level": "raid1", 00:28:46.265 "superblock": true, 00:28:46.265 "num_base_bdevs": 2, 00:28:46.265 "num_base_bdevs_discovered": 2, 00:28:46.266 "num_base_bdevs_operational": 2, 00:28:46.266 "base_bdevs_list": [ 00:28:46.266 { 00:28:46.266 "name": "spare", 00:28:46.266 "uuid": "f9b07975-ed9e-53d9-91f4-c8a3941698eb", 00:28:46.266 "is_configured": true, 00:28:46.266 "data_offset": 256, 00:28:46.266 "data_size": 7936 00:28:46.266 }, 00:28:46.266 { 00:28:46.266 "name": "BaseBdev2", 00:28:46.266 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:46.266 "is_configured": true, 00:28:46.266 "data_offset": 256, 00:28:46.266 "data_size": 7936 00:28:46.266 } 00:28:46.266 ] 00:28:46.266 }' 00:28:46.266 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:46.266 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:46.266 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:46.525 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:46.525 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:28:46.525 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:46.525 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:46.525 22:35:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:46.525 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:46.525 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:46.525 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.525 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:46.525 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:46.525 "name": "raid_bdev1", 00:28:46.525 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:46.525 "strip_size_kb": 0, 00:28:46.525 "state": "online", 00:28:46.525 "raid_level": "raid1", 00:28:46.525 "superblock": true, 00:28:46.525 "num_base_bdevs": 2, 00:28:46.525 "num_base_bdevs_discovered": 2, 00:28:46.525 "num_base_bdevs_operational": 2, 00:28:46.525 "base_bdevs_list": [ 00:28:46.525 { 00:28:46.525 "name": "spare", 00:28:46.525 "uuid": "f9b07975-ed9e-53d9-91f4-c8a3941698eb", 00:28:46.525 "is_configured": true, 00:28:46.525 "data_offset": 256, 00:28:46.525 "data_size": 7936 00:28:46.525 }, 00:28:46.525 { 00:28:46.525 "name": "BaseBdev2", 00:28:46.525 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:46.525 "is_configured": true, 00:28:46.525 "data_offset": 256, 00:28:46.525 "data_size": 7936 00:28:46.525 } 00:28:46.525 ] 00:28:46.525 }' 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:28:46.785 22:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:47.044 22:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:47.044 "name": "raid_bdev1", 00:28:47.044 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:47.044 "strip_size_kb": 0, 00:28:47.044 "state": "online", 00:28:47.044 "raid_level": "raid1", 00:28:47.044 "superblock": true, 00:28:47.044 "num_base_bdevs": 2, 00:28:47.044 "num_base_bdevs_discovered": 2, 00:28:47.044 "num_base_bdevs_operational": 2, 00:28:47.044 "base_bdevs_list": [ 00:28:47.044 { 00:28:47.044 "name": "spare", 00:28:47.044 "uuid": "f9b07975-ed9e-53d9-91f4-c8a3941698eb", 00:28:47.044 "is_configured": true, 00:28:47.044 "data_offset": 256, 00:28:47.044 "data_size": 7936 00:28:47.044 }, 00:28:47.044 { 00:28:47.044 "name": "BaseBdev2", 00:28:47.044 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:47.044 "is_configured": true, 00:28:47.044 "data_offset": 256, 00:28:47.044 "data_size": 7936 00:28:47.044 } 00:28:47.044 ] 00:28:47.044 }' 00:28:47.044 22:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:47.044 22:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:47.612 22:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:47.871 [2024-07-12 22:35:58.009765] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:47.871 [2024-07-12 22:35:58.009795] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:47.871 [2024-07-12 22:35:58.009857] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:47.871 [2024-07-12 22:35:58.009916] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:47.871 [2024-07-12 22:35:58.009934] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13f61c0 name raid_bdev1, state offline 00:28:47.871 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:47.871 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:28:48.438 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:48.438 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:28:48.438 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:28:48.438 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:28:48.438 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:48.438 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:28:48.438 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:48.439 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:48.439 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:48.439 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:28:48.439 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:48.439 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:48.439 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:28:48.697 /dev/nbd0 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:48.697 1+0 records in 00:28:48.697 1+0 records out 00:28:48.697 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026756 s, 15.3 MB/s 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:48.697 22:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:28:48.956 /dev/nbd1 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:48.957 1+0 records in 00:28:48.957 1+0 records out 00:28:48.957 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000282038 s, 14.5 MB/s 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:48.957 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:49.216 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:49.216 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd0 00:28:49.216 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:49.216 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:49.216 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:49.216 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:49.216 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:49.216 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:49.216 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:49.216 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:49.475 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:49.475 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:49.475 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:49.475 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:49.475 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:49.475 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:49.475 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:49.475 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:49.475 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:49.475 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:49.734 22:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:49.992 [2024-07-12 22:36:00.198645] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:49.992 [2024-07-12 22:36:00.198699] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:49.992 [2024-07-12 22:36:00.198722] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13f59d0 00:28:49.992 [2024-07-12 22:36:00.198735] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:49.992 [2024-07-12 22:36:00.200243] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:49.992 [2024-07-12 22:36:00.200273] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:49.992 [2024-07-12 22:36:00.200338] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:49.992 [2024-07-12 22:36:00.200364] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:49.992 [2024-07-12 22:36:00.200462] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev2 is claimed 00:28:49.992 spare 00:28:49.992 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:49.992 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:49.992 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:49.992 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:49.992 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:49.992 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:49.992 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:49.992 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:49.992 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:49.992 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:49.992 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:49.992 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:49.992 [2024-07-12 22:36:00.300774] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13077c0 00:28:49.992 [2024-07-12 22:36:00.300795] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:49.992 [2024-07-12 22:36:00.300879] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13f7cd0 00:28:49.992 [2024-07-12 22:36:00.301021] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13077c0 00:28:49.992 [2024-07-12 22:36:00.301031] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13077c0 00:28:49.992 [2024-07-12 22:36:00.301116] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:50.251 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:50.251 "name": "raid_bdev1", 00:28:50.251 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:50.251 "strip_size_kb": 0, 00:28:50.251 "state": "online", 00:28:50.251 "raid_level": "raid1", 00:28:50.251 "superblock": true, 00:28:50.251 "num_base_bdevs": 2, 00:28:50.251 "num_base_bdevs_discovered": 2, 00:28:50.251 "num_base_bdevs_operational": 2, 00:28:50.251 "base_bdevs_list": [ 00:28:50.251 { 00:28:50.251 "name": "spare", 00:28:50.251 "uuid": "f9b07975-ed9e-53d9-91f4-c8a3941698eb", 00:28:50.251 "is_configured": true, 00:28:50.251 "data_offset": 256, 00:28:50.251 "data_size": 7936 00:28:50.251 }, 00:28:50.251 { 00:28:50.251 "name": "BaseBdev2", 00:28:50.251 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:50.251 "is_configured": true, 00:28:50.251 "data_offset": 256, 00:28:50.251 "data_size": 7936 00:28:50.251 } 00:28:50.251 ] 00:28:50.251 }' 00:28:50.251 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:50.251 22:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 
-- # set +x 00:28:50.817 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:50.817 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:50.817 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:50.817 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:50.817 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:50.817 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:50.817 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.076 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:51.076 "name": "raid_bdev1", 00:28:51.076 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:51.076 "strip_size_kb": 0, 00:28:51.076 "state": "online", 00:28:51.076 "raid_level": "raid1", 00:28:51.076 "superblock": true, 00:28:51.076 "num_base_bdevs": 2, 00:28:51.076 "num_base_bdevs_discovered": 2, 00:28:51.076 "num_base_bdevs_operational": 2, 00:28:51.076 "base_bdevs_list": [ 00:28:51.076 { 00:28:51.076 "name": "spare", 00:28:51.076 "uuid": "f9b07975-ed9e-53d9-91f4-c8a3941698eb", 00:28:51.076 "is_configured": true, 00:28:51.076 "data_offset": 256, 00:28:51.076 "data_size": 7936 00:28:51.076 }, 00:28:51.076 { 00:28:51.076 "name": "BaseBdev2", 00:28:51.076 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:51.076 "is_configured": true, 00:28:51.076 "data_offset": 256, 00:28:51.076 "data_size": 7936 00:28:51.076 } 00:28:51.076 ] 00:28:51.076 }' 00:28:51.076 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:51.076 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:51.076 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:51.076 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:51.076 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.076 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:51.336 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:51.336 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:51.594 [2024-07-12 22:36:01.782976] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:51.594 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:51.594 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:51.594 22:36:01 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:51.595 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:51.595 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:51.595 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:51.595 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:51.595 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:51.595 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:51.595 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:51.595 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.595 22:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:51.853 22:36:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:51.853 "name": "raid_bdev1", 00:28:51.853 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:51.853 "strip_size_kb": 0, 00:28:51.853 "state": "online", 00:28:51.853 "raid_level": "raid1", 00:28:51.853 "superblock": true, 00:28:51.853 "num_base_bdevs": 2, 00:28:51.853 "num_base_bdevs_discovered": 1, 00:28:51.853 "num_base_bdevs_operational": 1, 00:28:51.853 "base_bdevs_list": [ 00:28:51.853 { 00:28:51.853 "name": null, 00:28:51.853 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:51.853 "is_configured": false, 00:28:51.853 "data_offset": 256, 00:28:51.853 "data_size": 7936 00:28:51.853 }, 00:28:51.853 { 00:28:51.853 "name": "BaseBdev2", 00:28:51.853 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:51.853 "is_configured": true, 00:28:51.853 "data_offset": 256, 00:28:51.853 "data_size": 7936 00:28:51.853 } 00:28:51.853 ] 00:28:51.853 }' 00:28:51.853 22:36:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:51.853 22:36:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:52.421 22:36:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:52.680 [2024-07-12 22:36:02.809710] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:52.680 [2024-07-12 22:36:02.809883] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:52.680 [2024-07-12 22:36:02.809900] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
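The verify_raid_bdev_state expansion traced just above reduces to a single bdev_raid_get_bdevs RPC whose output is filtered with jq. Below is a minimal stand-alone sketch of that check, reconstructed from the traced commands; the RPC socket path and the jq filter are copied verbatim from the trace, while the per-field assertions are an assumption about what bdev_raid.sh compares, since the helper's body is not fully expanded in this excerpt.

    #!/usr/bin/env bash
    # Hedged reconstruction, not the verbatim bdev_raid.sh helper.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    verify_raid_bdev_state_sketch() {
        local name=$1 expected_state=$2 raid_level=$3 strip_size=$4 operational=$5
        local info
        # Same query as the trace: dump all raid bdevs, keep the one under test.
        info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
        [ -n "$info" ] || return 1
        [ "$(jq -r '.state' <<<"$info")" = "$expected_state" ] || return 1
        [ "$(jq -r '.raid_level' <<<"$info")" = "$raid_level" ] || return 1
        [ "$(jq -r '.strip_size_kb' <<<"$info")" = "$strip_size" ] || return 1
        [ "$(jq -r '.num_base_bdevs_operational' <<<"$info")" = "$operational" ] || return 1
    }

    # Usage mirroring the traced call:
    # verify_raid_bdev_state_sketch raid_bdev1 online raid1 0 1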
00:28:52.680 [2024-07-12 22:36:02.809938] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:52.680 [2024-07-12 22:36:02.812152] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12a01d0 00:28:52.680 [2024-07-12 22:36:02.813494] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:52.680 22:36:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:53.617 22:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:53.617 22:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:53.617 22:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:53.617 22:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:53.617 22:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:53.617 22:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:53.617 22:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:53.876 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:53.876 "name": "raid_bdev1", 00:28:53.876 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:53.876 "strip_size_kb": 0, 00:28:53.876 "state": "online", 00:28:53.876 "raid_level": "raid1", 00:28:53.876 "superblock": true, 00:28:53.876 "num_base_bdevs": 2, 00:28:53.876 "num_base_bdevs_discovered": 2, 00:28:53.876 "num_base_bdevs_operational": 2, 00:28:53.876 "process": { 00:28:53.876 "type": "rebuild", 00:28:53.876 "target": "spare", 00:28:53.876 "progress": { 00:28:53.876 "blocks": 3072, 00:28:53.876 "percent": 38 00:28:53.876 } 00:28:53.876 }, 00:28:53.876 "base_bdevs_list": [ 00:28:53.876 { 00:28:53.876 "name": "spare", 00:28:53.876 "uuid": "f9b07975-ed9e-53d9-91f4-c8a3941698eb", 00:28:53.876 "is_configured": true, 00:28:53.876 "data_offset": 256, 00:28:53.876 "data_size": 7936 00:28:53.876 }, 00:28:53.876 { 00:28:53.876 "name": "BaseBdev2", 00:28:53.876 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:53.876 "is_configured": true, 00:28:53.876 "data_offset": 256, 00:28:53.876 "data_size": 7936 00:28:53.876 } 00:28:53.876 ] 00:28:53.876 }' 00:28:53.876 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:53.876 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:53.876 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:53.876 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:53.876 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:54.134 [2024-07-12 22:36:04.390894] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:54.134 [2024-07-12 22:36:04.426270] bdev_raid.c:2513:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:54.134 [2024-07-12 22:36:04.426324] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:54.134 [2024-07-12 22:36:04.426339] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:54.134 [2024-07-12 22:36:04.426348] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:54.134 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:54.134 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:54.134 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:54.134 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:54.134 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:54.134 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:54.134 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:54.134 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:54.134 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:54.134 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:54.134 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:54.134 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:54.393 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:54.393 "name": "raid_bdev1", 00:28:54.393 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:54.393 "strip_size_kb": 0, 00:28:54.393 "state": "online", 00:28:54.393 "raid_level": "raid1", 00:28:54.393 "superblock": true, 00:28:54.393 "num_base_bdevs": 2, 00:28:54.393 "num_base_bdevs_discovered": 1, 00:28:54.393 "num_base_bdevs_operational": 1, 00:28:54.393 "base_bdevs_list": [ 00:28:54.393 { 00:28:54.393 "name": null, 00:28:54.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:54.393 "is_configured": false, 00:28:54.393 "data_offset": 256, 00:28:54.393 "data_size": 7936 00:28:54.393 }, 00:28:54.393 { 00:28:54.393 "name": "BaseBdev2", 00:28:54.393 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:54.393 "is_configured": true, 00:28:54.393 "data_offset": 256, 00:28:54.393 "data_size": 7936 00:28:54.393 } 00:28:54.393 ] 00:28:54.393 }' 00:28:54.393 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:54.393 22:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:55.329 22:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:55.329 [2024-07-12 22:36:05.516375] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 
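The bdev_passthru_create call above recreates the "spare" passthru on top of spare_delay; the examine path then spots its older superblock (seq_number 4 versus 5 in the trace) and re-adds it to raid_bdev1, which starts another rebuild. A hedged sketch of that spare-churn cycle follows, using only RPC method names and the socket path that appear in the trace; the surrounding control flow is an assumption, not a copy of bdev_raid.sh.

    # Convenience wrapper around the traced rpc.py invocation.
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

    # 1. Delete the passthru backing the "spare" slot; the trace shows this
    #    triggers _raid_bdev_remove_base_bdev for it.
    rpc bdev_passthru_delete spare
    # 2. Recreate it on top of spare_delay; examine finds the stale superblock
    #    and re-adds the bdev, starting a rebuild on raid_bdev1.
    rpc bdev_passthru_create -b spare_delay -p spare
    # 3. Give the rebuild a moment, then inspect progress the same way
    #    verify_raid_bdev_process does.
    sleep 1
    rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").process'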
00:28:55.329 [2024-07-12 22:36:05.516435] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:55.329 [2024-07-12 22:36:05.516458] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x143a810 00:28:55.329 [2024-07-12 22:36:05.516471] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:55.329 [2024-07-12 22:36:05.516709] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:55.329 [2024-07-12 22:36:05.516725] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:55.329 [2024-07-12 22:36:05.516789] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:55.329 [2024-07-12 22:36:05.516801] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:55.329 [2024-07-12 22:36:05.516813] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:55.329 [2024-07-12 22:36:05.516832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:55.329 [2024-07-12 22:36:05.519071] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12a0980 00:28:55.329 [2024-07-12 22:36:05.520408] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:55.329 spare 00:28:55.329 22:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:56.266 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:56.266 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:56.266 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:56.266 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:56.266 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:56.266 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:56.266 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:56.525 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:56.525 "name": "raid_bdev1", 00:28:56.525 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:56.525 "strip_size_kb": 0, 00:28:56.525 "state": "online", 00:28:56.525 "raid_level": "raid1", 00:28:56.525 "superblock": true, 00:28:56.525 "num_base_bdevs": 2, 00:28:56.525 "num_base_bdevs_discovered": 2, 00:28:56.525 "num_base_bdevs_operational": 2, 00:28:56.525 "process": { 00:28:56.525 "type": "rebuild", 00:28:56.525 "target": "spare", 00:28:56.525 "progress": { 00:28:56.525 "blocks": 3072, 00:28:56.525 "percent": 38 00:28:56.525 } 00:28:56.525 }, 00:28:56.525 "base_bdevs_list": [ 00:28:56.525 { 00:28:56.525 "name": "spare", 00:28:56.525 "uuid": "f9b07975-ed9e-53d9-91f4-c8a3941698eb", 00:28:56.525 "is_configured": true, 00:28:56.525 "data_offset": 256, 00:28:56.525 "data_size": 7936 00:28:56.525 }, 00:28:56.525 { 00:28:56.525 "name": "BaseBdev2", 00:28:56.525 "uuid": 
"b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:56.525 "is_configured": true, 00:28:56.525 "data_offset": 256, 00:28:56.525 "data_size": 7936 00:28:56.525 } 00:28:56.525 ] 00:28:56.525 }' 00:28:56.525 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:56.525 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:56.525 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:56.784 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:56.784 22:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:56.784 [2024-07-12 22:36:07.105307] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:57.043 [2024-07-12 22:36:07.132957] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:57.043 [2024-07-12 22:36:07.133013] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:57.043 [2024-07-12 22:36:07.133029] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:57.043 [2024-07-12 22:36:07.133038] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:57.043 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:57.043 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:57.043 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:57.043 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:57.043 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:57.043 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:57.043 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:57.043 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:57.043 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:57.043 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:57.043 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.043 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:57.302 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:57.302 "name": "raid_bdev1", 00:28:57.302 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:57.302 "strip_size_kb": 0, 00:28:57.302 "state": "online", 00:28:57.302 "raid_level": "raid1", 00:28:57.302 "superblock": true, 00:28:57.302 "num_base_bdevs": 2, 00:28:57.302 "num_base_bdevs_discovered": 1, 00:28:57.302 
"num_base_bdevs_operational": 1, 00:28:57.302 "base_bdevs_list": [ 00:28:57.302 { 00:28:57.302 "name": null, 00:28:57.302 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:57.302 "is_configured": false, 00:28:57.302 "data_offset": 256, 00:28:57.302 "data_size": 7936 00:28:57.302 }, 00:28:57.302 { 00:28:57.302 "name": "BaseBdev2", 00:28:57.302 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:57.302 "is_configured": true, 00:28:57.302 "data_offset": 256, 00:28:57.302 "data_size": 7936 00:28:57.302 } 00:28:57.302 ] 00:28:57.302 }' 00:28:57.302 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:57.302 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:57.870 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:57.870 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:57.870 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:57.870 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:57.870 22:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:57.870 22:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.870 22:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:58.129 22:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:58.129 "name": "raid_bdev1", 00:28:58.129 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:58.129 "strip_size_kb": 0, 00:28:58.129 "state": "online", 00:28:58.129 "raid_level": "raid1", 00:28:58.129 "superblock": true, 00:28:58.129 "num_base_bdevs": 2, 00:28:58.129 "num_base_bdevs_discovered": 1, 00:28:58.129 "num_base_bdevs_operational": 1, 00:28:58.129 "base_bdevs_list": [ 00:28:58.129 { 00:28:58.129 "name": null, 00:28:58.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:58.129 "is_configured": false, 00:28:58.129 "data_offset": 256, 00:28:58.129 "data_size": 7936 00:28:58.129 }, 00:28:58.129 { 00:28:58.129 "name": "BaseBdev2", 00:28:58.129 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:58.129 "is_configured": true, 00:28:58.129 "data_offset": 256, 00:28:58.129 "data_size": 7936 00:28:58.129 } 00:28:58.129 ] 00:28:58.129 }' 00:28:58.129 22:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:58.129 22:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:58.129 22:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:58.129 22:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:58.129 22:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:58.388 22:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:58.646 [2024-07-12 22:36:08.792572] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:58.646 [2024-07-12 22:36:08.792624] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:58.646 [2024-07-12 22:36:08.792646] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13f7010 00:28:58.646 [2024-07-12 22:36:08.792659] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:58.646 [2024-07-12 22:36:08.792868] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:58.646 [2024-07-12 22:36:08.792885] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:58.646 [2024-07-12 22:36:08.792944] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:58.646 [2024-07-12 22:36:08.792957] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:58.646 [2024-07-12 22:36:08.792968] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:58.646 BaseBdev1 00:28:58.646 22:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:59.603 22:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:59.603 22:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:59.603 22:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:59.603 22:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:59.603 22:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:59.603 22:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:59.603 22:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:59.603 22:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:59.603 22:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:59.603 22:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:59.603 22:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:59.603 22:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:59.936 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:59.936 "name": "raid_bdev1", 00:28:59.936 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:28:59.936 "strip_size_kb": 0, 00:28:59.936 "state": "online", 00:28:59.936 "raid_level": "raid1", 00:28:59.936 "superblock": true, 00:28:59.936 "num_base_bdevs": 2, 00:28:59.936 "num_base_bdevs_discovered": 1, 00:28:59.936 "num_base_bdevs_operational": 1, 00:28:59.936 "base_bdevs_list": [ 00:28:59.936 { 
00:28:59.936 "name": null, 00:28:59.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:59.936 "is_configured": false, 00:28:59.936 "data_offset": 256, 00:28:59.936 "data_size": 7936 00:28:59.936 }, 00:28:59.936 { 00:28:59.936 "name": "BaseBdev2", 00:28:59.936 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:28:59.936 "is_configured": true, 00:28:59.936 "data_offset": 256, 00:28:59.936 "data_size": 7936 00:28:59.936 } 00:28:59.936 ] 00:28:59.936 }' 00:28:59.936 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:59.936 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:00.524 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:00.524 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:00.524 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:00.524 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:00.524 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:00.524 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:00.524 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:00.782 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:00.782 "name": "raid_bdev1", 00:29:00.782 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:29:00.782 "strip_size_kb": 0, 00:29:00.782 "state": "online", 00:29:00.782 "raid_level": "raid1", 00:29:00.782 "superblock": true, 00:29:00.782 "num_base_bdevs": 2, 00:29:00.782 "num_base_bdevs_discovered": 1, 00:29:00.782 "num_base_bdevs_operational": 1, 00:29:00.782 "base_bdevs_list": [ 00:29:00.782 { 00:29:00.782 "name": null, 00:29:00.782 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:00.782 "is_configured": false, 00:29:00.782 "data_offset": 256, 00:29:00.782 "data_size": 7936 00:29:00.782 }, 00:29:00.782 { 00:29:00.782 "name": "BaseBdev2", 00:29:00.782 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:29:00.782 "is_configured": true, 00:29:00.782 "data_offset": 256, 00:29:00.782 "data_size": 7936 00:29:00.782 } 00:29:00.782 ] 00:29:00.782 }' 00:29:00.782 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:00.782 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:00.782 22:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:00.782 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:01.040 [2024-07-12 22:36:11.239100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:01.040 [2024-07-12 22:36:11.239243] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:01.040 [2024-07-12 22:36:11.239260] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:01.040 request: 00:29:01.040 { 00:29:01.040 "base_bdev": "BaseBdev1", 00:29:01.040 "raid_bdev": "raid_bdev1", 00:29:01.040 "method": "bdev_raid_add_base_bdev", 00:29:01.040 "req_id": 1 00:29:01.040 } 00:29:01.040 Got JSON-RPC error response 00:29:01.040 response: 00:29:01.040 { 00:29:01.040 "code": -22, 00:29:01.040 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:01.040 } 00:29:01.040 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:29:01.040 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:01.040 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:01.040 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:01.040 22:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:29:01.972 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:01.972 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:01.972 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:01.972 22:36:12 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:01.972 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:01.972 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:01.972 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:01.972 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:01.972 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:01.972 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:01.972 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.972 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:02.229 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:02.229 "name": "raid_bdev1", 00:29:02.229 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:29:02.229 "strip_size_kb": 0, 00:29:02.229 "state": "online", 00:29:02.229 "raid_level": "raid1", 00:29:02.229 "superblock": true, 00:29:02.229 "num_base_bdevs": 2, 00:29:02.229 "num_base_bdevs_discovered": 1, 00:29:02.229 "num_base_bdevs_operational": 1, 00:29:02.229 "base_bdevs_list": [ 00:29:02.229 { 00:29:02.229 "name": null, 00:29:02.229 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:02.229 "is_configured": false, 00:29:02.229 "data_offset": 256, 00:29:02.229 "data_size": 7936 00:29:02.229 }, 00:29:02.229 { 00:29:02.229 "name": "BaseBdev2", 00:29:02.229 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:29:02.229 "is_configured": true, 00:29:02.229 "data_offset": 256, 00:29:02.229 "data_size": 7936 00:29:02.229 } 00:29:02.229 ] 00:29:02.229 }' 00:29:02.229 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:02.229 22:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:02.792 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:02.793 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:02.793 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:02.793 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:02.793 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:02.793 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.050 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:03.050 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:03.050 "name": "raid_bdev1", 00:29:03.050 "uuid": "c5bcb449-4449-49e9-adf5-0d6a339d5576", 00:29:03.050 "strip_size_kb": 0, 
00:29:03.050 "state": "online", 00:29:03.050 "raid_level": "raid1", 00:29:03.050 "superblock": true, 00:29:03.050 "num_base_bdevs": 2, 00:29:03.050 "num_base_bdevs_discovered": 1, 00:29:03.050 "num_base_bdevs_operational": 1, 00:29:03.050 "base_bdevs_list": [ 00:29:03.050 { 00:29:03.050 "name": null, 00:29:03.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:03.050 "is_configured": false, 00:29:03.050 "data_offset": 256, 00:29:03.050 "data_size": 7936 00:29:03.050 }, 00:29:03.050 { 00:29:03.050 "name": "BaseBdev2", 00:29:03.050 "uuid": "b343bfff-2145-57ca-bcb6-82365d22f030", 00:29:03.050 "is_configured": true, 00:29:03.050 "data_offset": 256, 00:29:03.050 "data_size": 7936 00:29:03.050 } 00:29:03.050 ] 00:29:03.050 }' 00:29:03.050 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 3571748 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 3571748 ']' 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 3571748 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3571748 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3571748' 00:29:03.308 killing process with pid 3571748 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 3571748 00:29:03.308 Received shutdown signal, test time was about 60.000000 seconds 00:29:03.308 00:29:03.308 Latency(us) 00:29:03.308 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:03.308 =================================================================================================================== 00:29:03.308 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:03.308 [2024-07-12 22:36:13.499980] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:03.308 [2024-07-12 22:36:13.500081] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:03.308 [2024-07-12 22:36:13.500132] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:03.308 [2024-07-12 22:36:13.500151] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13077c0 name raid_bdev1, state offline 00:29:03.308 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 
3571748 00:29:03.308 [2024-07-12 22:36:13.537832] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:03.566 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:29:03.566 00:29:03.566 real 0m31.352s 00:29:03.566 user 0m48.872s 00:29:03.566 sys 0m5.018s 00:29:03.566 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:03.566 22:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:03.566 ************************************ 00:29:03.566 END TEST raid_rebuild_test_sb_md_separate 00:29:03.566 ************************************ 00:29:03.566 22:36:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:03.566 22:36:13 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:29:03.566 22:36:13 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:29:03.566 22:36:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:29:03.566 22:36:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:03.566 22:36:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:03.566 ************************************ 00:29:03.566 START TEST raid_state_function_test_sb_md_interleaved 00:29:03.566 ************************************ 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=3576741 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 3576741' 00:29:03.566 Process raid pid: 3576741 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 3576741 /var/tmp/spdk-raid.sock 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 3576741 ']' 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:03.566 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:03.566 22:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:03.823 [2024-07-12 22:36:13.916693] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
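At this point the raid_state_function_test_sb_md_interleaved run has launched bdev_svc against a fresh /var/tmp/spdk-raid.sock and waitforlisten is blocking until the target is reachable over RPC. A hedged sketch of that start-and-wait pattern: the app path, socket path, -i 0 -L bdev_raid arguments and max_retries=100 are taken from the trace, while the polling loop body is an assumption, since autotest_common.sh's waitforlisten is not expanded in this excerpt.

    app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
    sock=/var/tmp/spdk-raid.sock
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

    "$app" -r "$sock" -i 0 -L bdev_raid &
    pid=$!

    for ((retry = 0; retry < 100; retry++)); do
        # Consider the target ready once the RPC socket answers a trivial request.
        if "$rpc" -s "$sock" rpc_get_methods >/dev/null 2>&1; then
            break
        fi
        kill -0 "$pid" 2>/dev/null || exit 1   # bail out if the app died during startup
        sleep 0.5
    done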
00:29:03.823 [2024-07-12 22:36:13.916763] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:03.823 [2024-07-12 22:36:14.036766] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:03.823 [2024-07-12 22:36:14.142665] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:04.081 [2024-07-12 22:36:14.211386] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:04.081 [2024-07-12 22:36:14.211422] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:04.644 22:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:04.644 22:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:04.644 22:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:04.902 [2024-07-12 22:36:14.985004] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:04.902 [2024-07-12 22:36:14.985045] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:04.902 [2024-07-12 22:36:14.985056] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:04.902 [2024-07-12 22:36:14.985068] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:04.902 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:04.902 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:04.902 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:04.902 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:04.902 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:04.902 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:04.902 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:04.902 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:04.902 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:04.902 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:04.902 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:04.902 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:05.160 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:05.160 "name": "Existed_Raid", 00:29:05.160 "uuid": "bed25f91-5baa-461b-a21e-74182536b8ee", 00:29:05.160 "strip_size_kb": 0, 00:29:05.160 "state": "configuring", 00:29:05.160 "raid_level": "raid1", 00:29:05.160 "superblock": true, 00:29:05.160 "num_base_bdevs": 2, 00:29:05.160 "num_base_bdevs_discovered": 0, 00:29:05.160 "num_base_bdevs_operational": 2, 00:29:05.160 "base_bdevs_list": [ 00:29:05.160 { 00:29:05.160 "name": "BaseBdev1", 00:29:05.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:05.160 "is_configured": false, 00:29:05.160 "data_offset": 0, 00:29:05.160 "data_size": 0 00:29:05.160 }, 00:29:05.160 { 00:29:05.160 "name": "BaseBdev2", 00:29:05.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:05.160 "is_configured": false, 00:29:05.160 "data_offset": 0, 00:29:05.160 "data_size": 0 00:29:05.160 } 00:29:05.160 ] 00:29:05.160 }' 00:29:05.160 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:05.160 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:05.725 22:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:05.725 [2024-07-12 22:36:16.047687] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:05.725 [2024-07-12 22:36:16.047721] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfeba80 name Existed_Raid, state configuring 00:29:05.983 22:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:05.983 [2024-07-12 22:36:16.292354] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:05.983 [2024-07-12 22:36:16.292385] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:05.983 [2024-07-12 22:36:16.292395] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:05.983 [2024-07-12 22:36:16.292406] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:06.241 22:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:29:06.241 [2024-07-12 22:36:16.547075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:06.241 BaseBdev1 00:29:06.241 22:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:29:06.241 22:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:29:06.241 22:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:06.241 22:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:29:06.241 22:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:06.241 22:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:06.499 22:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:06.499 22:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:06.757 [ 00:29:06.757 { 00:29:06.757 "name": "BaseBdev1", 00:29:06.757 "aliases": [ 00:29:06.757 "d9a9f6e8-c127-4068-a349-8bf99b650604" 00:29:06.757 ], 00:29:06.757 "product_name": "Malloc disk", 00:29:06.757 "block_size": 4128, 00:29:06.757 "num_blocks": 8192, 00:29:06.757 "uuid": "d9a9f6e8-c127-4068-a349-8bf99b650604", 00:29:06.757 "md_size": 32, 00:29:06.757 "md_interleave": true, 00:29:06.757 "dif_type": 0, 00:29:06.757 "assigned_rate_limits": { 00:29:06.757 "rw_ios_per_sec": 0, 00:29:06.757 "rw_mbytes_per_sec": 0, 00:29:06.757 "r_mbytes_per_sec": 0, 00:29:06.757 "w_mbytes_per_sec": 0 00:29:06.757 }, 00:29:06.757 "claimed": true, 00:29:06.757 "claim_type": "exclusive_write", 00:29:06.757 "zoned": false, 00:29:06.757 "supported_io_types": { 00:29:06.757 "read": true, 00:29:06.757 "write": true, 00:29:06.757 "unmap": true, 00:29:06.757 "flush": true, 00:29:06.757 "reset": true, 00:29:06.757 "nvme_admin": false, 00:29:06.757 "nvme_io": false, 00:29:06.757 "nvme_io_md": false, 00:29:06.757 "write_zeroes": true, 00:29:06.757 "zcopy": true, 00:29:06.757 "get_zone_info": false, 00:29:06.757 "zone_management": false, 00:29:06.757 "zone_append": false, 00:29:06.757 "compare": false, 00:29:06.757 "compare_and_write": false, 00:29:06.757 "abort": true, 00:29:06.757 "seek_hole": false, 00:29:06.757 "seek_data": false, 00:29:06.757 "copy": true, 00:29:06.757 "nvme_iov_md": false 00:29:06.757 }, 00:29:06.757 "memory_domains": [ 00:29:06.757 { 00:29:06.757 "dma_device_id": "system", 00:29:06.757 "dma_device_type": 1 00:29:06.757 }, 00:29:06.757 { 00:29:06.757 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:06.757 "dma_device_type": 2 00:29:06.757 } 00:29:06.757 ], 00:29:06.757 "driver_specific": {} 00:29:06.757 } 00:29:06.757 ] 00:29:06.757 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:29:06.757 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:06.757 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:06.757 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:06.757 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:06.757 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:06.758 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:06.758 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:06.758 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:06.758 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:06.758 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:06.758 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.758 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:07.016 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:07.016 "name": "Existed_Raid", 00:29:07.016 "uuid": "37864dd7-1103-4d68-8a18-c690c6c34d88", 00:29:07.016 "strip_size_kb": 0, 00:29:07.016 "state": "configuring", 00:29:07.016 "raid_level": "raid1", 00:29:07.016 "superblock": true, 00:29:07.016 "num_base_bdevs": 2, 00:29:07.016 "num_base_bdevs_discovered": 1, 00:29:07.016 "num_base_bdevs_operational": 2, 00:29:07.016 "base_bdevs_list": [ 00:29:07.016 { 00:29:07.016 "name": "BaseBdev1", 00:29:07.016 "uuid": "d9a9f6e8-c127-4068-a349-8bf99b650604", 00:29:07.016 "is_configured": true, 00:29:07.016 "data_offset": 256, 00:29:07.016 "data_size": 7936 00:29:07.016 }, 00:29:07.016 { 00:29:07.016 "name": "BaseBdev2", 00:29:07.016 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:07.016 "is_configured": false, 00:29:07.016 "data_offset": 0, 00:29:07.016 "data_size": 0 00:29:07.016 } 00:29:07.016 ] 00:29:07.016 }' 00:29:07.016 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:07.016 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:07.583 22:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:07.840 [2024-07-12 22:36:18.115282] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:07.840 [2024-07-12 22:36:18.115324] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfeb350 name Existed_Raid, state configuring 00:29:07.840 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:08.099 [2024-07-12 22:36:18.359970] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:08.099 [2024-07-12 22:36:18.361460] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:08.099 [2024-07-12 22:36:18.361494] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:08.099 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:08.357 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:08.357 "name": "Existed_Raid", 00:29:08.357 "uuid": "8558a978-ed10-4114-a748-3918de06a846", 00:29:08.357 "strip_size_kb": 0, 00:29:08.357 "state": "configuring", 00:29:08.357 "raid_level": "raid1", 00:29:08.357 "superblock": true, 00:29:08.357 "num_base_bdevs": 2, 00:29:08.357 "num_base_bdevs_discovered": 1, 00:29:08.357 "num_base_bdevs_operational": 2, 00:29:08.357 "base_bdevs_list": [ 00:29:08.357 { 00:29:08.357 "name": "BaseBdev1", 00:29:08.357 "uuid": "d9a9f6e8-c127-4068-a349-8bf99b650604", 00:29:08.357 "is_configured": true, 00:29:08.357 "data_offset": 256, 00:29:08.357 "data_size": 7936 00:29:08.357 }, 00:29:08.357 { 00:29:08.357 "name": "BaseBdev2", 00:29:08.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:08.357 "is_configured": false, 00:29:08.357 "data_offset": 0, 00:29:08.357 "data_size": 0 00:29:08.357 } 00:29:08.357 ] 00:29:08.357 }' 00:29:08.357 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:08.357 22:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:08.922 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:29:09.180 [2024-07-12 22:36:19.402379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:09.180 [2024-07-12 22:36:19.402520] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfed180 00:29:09.180 [2024-07-12 22:36:19.402534] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:09.180 [2024-07-12 22:36:19.402594] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfed150 00:29:09.180 [2024-07-12 22:36:19.402669] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfed180 00:29:09.180 [2024-07-12 22:36:19.402680] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, 
raid_bdev 0xfed180 00:29:09.180 [2024-07-12 22:36:19.402738] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:09.180 BaseBdev2 00:29:09.180 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:29:09.180 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:29:09.180 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:09.180 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:29:09.180 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:09.180 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:09.180 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:09.438 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:09.696 [ 00:29:09.696 { 00:29:09.696 "name": "BaseBdev2", 00:29:09.696 "aliases": [ 00:29:09.696 "13352002-a4ee-4dfa-a29b-dfc2be370fe0" 00:29:09.696 ], 00:29:09.696 "product_name": "Malloc disk", 00:29:09.696 "block_size": 4128, 00:29:09.696 "num_blocks": 8192, 00:29:09.696 "uuid": "13352002-a4ee-4dfa-a29b-dfc2be370fe0", 00:29:09.696 "md_size": 32, 00:29:09.696 "md_interleave": true, 00:29:09.696 "dif_type": 0, 00:29:09.696 "assigned_rate_limits": { 00:29:09.696 "rw_ios_per_sec": 0, 00:29:09.696 "rw_mbytes_per_sec": 0, 00:29:09.696 "r_mbytes_per_sec": 0, 00:29:09.696 "w_mbytes_per_sec": 0 00:29:09.696 }, 00:29:09.696 "claimed": true, 00:29:09.696 "claim_type": "exclusive_write", 00:29:09.696 "zoned": false, 00:29:09.696 "supported_io_types": { 00:29:09.696 "read": true, 00:29:09.696 "write": true, 00:29:09.696 "unmap": true, 00:29:09.696 "flush": true, 00:29:09.696 "reset": true, 00:29:09.696 "nvme_admin": false, 00:29:09.696 "nvme_io": false, 00:29:09.696 "nvme_io_md": false, 00:29:09.696 "write_zeroes": true, 00:29:09.696 "zcopy": true, 00:29:09.696 "get_zone_info": false, 00:29:09.696 "zone_management": false, 00:29:09.696 "zone_append": false, 00:29:09.696 "compare": false, 00:29:09.696 "compare_and_write": false, 00:29:09.696 "abort": true, 00:29:09.696 "seek_hole": false, 00:29:09.696 "seek_data": false, 00:29:09.696 "copy": true, 00:29:09.696 "nvme_iov_md": false 00:29:09.696 }, 00:29:09.696 "memory_domains": [ 00:29:09.696 { 00:29:09.696 "dma_device_id": "system", 00:29:09.696 "dma_device_type": 1 00:29:09.696 }, 00:29:09.696 { 00:29:09.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:09.696 "dma_device_type": 2 00:29:09.696 } 00:29:09.696 ], 00:29:09.696 "driver_specific": {} 00:29:09.696 } 00:29:09.696 ] 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:09.696 22:36:19 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.696 22:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:09.955 22:36:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:09.955 "name": "Existed_Raid", 00:29:09.955 "uuid": "8558a978-ed10-4114-a748-3918de06a846", 00:29:09.955 "strip_size_kb": 0, 00:29:09.955 "state": "online", 00:29:09.955 "raid_level": "raid1", 00:29:09.955 "superblock": true, 00:29:09.955 "num_base_bdevs": 2, 00:29:09.955 "num_base_bdevs_discovered": 2, 00:29:09.955 "num_base_bdevs_operational": 2, 00:29:09.955 "base_bdevs_list": [ 00:29:09.955 { 00:29:09.955 "name": "BaseBdev1", 00:29:09.955 "uuid": "d9a9f6e8-c127-4068-a349-8bf99b650604", 00:29:09.955 "is_configured": true, 00:29:09.955 "data_offset": 256, 00:29:09.955 "data_size": 7936 00:29:09.955 }, 00:29:09.955 { 00:29:09.955 "name": "BaseBdev2", 00:29:09.955 "uuid": "13352002-a4ee-4dfa-a29b-dfc2be370fe0", 00:29:09.955 "is_configured": true, 00:29:09.955 "data_offset": 256, 00:29:09.955 "data_size": 7936 00:29:09.955 } 00:29:09.955 ] 00:29:09.955 }' 00:29:09.955 22:36:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:09.955 22:36:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:10.521 22:36:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:10.521 22:36:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:10.521 22:36:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:10.521 22:36:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:10.521 22:36:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # 
local base_bdev_names 00:29:10.521 22:36:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:10.521 22:36:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:10.521 22:36:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:10.779 [2024-07-12 22:36:20.978886] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:10.779 22:36:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:10.779 "name": "Existed_Raid", 00:29:10.779 "aliases": [ 00:29:10.779 "8558a978-ed10-4114-a748-3918de06a846" 00:29:10.779 ], 00:29:10.779 "product_name": "Raid Volume", 00:29:10.779 "block_size": 4128, 00:29:10.779 "num_blocks": 7936, 00:29:10.779 "uuid": "8558a978-ed10-4114-a748-3918de06a846", 00:29:10.779 "md_size": 32, 00:29:10.779 "md_interleave": true, 00:29:10.779 "dif_type": 0, 00:29:10.779 "assigned_rate_limits": { 00:29:10.779 "rw_ios_per_sec": 0, 00:29:10.779 "rw_mbytes_per_sec": 0, 00:29:10.779 "r_mbytes_per_sec": 0, 00:29:10.779 "w_mbytes_per_sec": 0 00:29:10.779 }, 00:29:10.779 "claimed": false, 00:29:10.779 "zoned": false, 00:29:10.779 "supported_io_types": { 00:29:10.779 "read": true, 00:29:10.779 "write": true, 00:29:10.779 "unmap": false, 00:29:10.779 "flush": false, 00:29:10.779 "reset": true, 00:29:10.779 "nvme_admin": false, 00:29:10.779 "nvme_io": false, 00:29:10.779 "nvme_io_md": false, 00:29:10.779 "write_zeroes": true, 00:29:10.779 "zcopy": false, 00:29:10.779 "get_zone_info": false, 00:29:10.779 "zone_management": false, 00:29:10.779 "zone_append": false, 00:29:10.779 "compare": false, 00:29:10.779 "compare_and_write": false, 00:29:10.779 "abort": false, 00:29:10.779 "seek_hole": false, 00:29:10.779 "seek_data": false, 00:29:10.779 "copy": false, 00:29:10.779 "nvme_iov_md": false 00:29:10.779 }, 00:29:10.779 "memory_domains": [ 00:29:10.779 { 00:29:10.779 "dma_device_id": "system", 00:29:10.779 "dma_device_type": 1 00:29:10.779 }, 00:29:10.779 { 00:29:10.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:10.779 "dma_device_type": 2 00:29:10.779 }, 00:29:10.779 { 00:29:10.779 "dma_device_id": "system", 00:29:10.779 "dma_device_type": 1 00:29:10.779 }, 00:29:10.779 { 00:29:10.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:10.779 "dma_device_type": 2 00:29:10.779 } 00:29:10.779 ], 00:29:10.779 "driver_specific": { 00:29:10.779 "raid": { 00:29:10.779 "uuid": "8558a978-ed10-4114-a748-3918de06a846", 00:29:10.779 "strip_size_kb": 0, 00:29:10.779 "state": "online", 00:29:10.779 "raid_level": "raid1", 00:29:10.779 "superblock": true, 00:29:10.779 "num_base_bdevs": 2, 00:29:10.779 "num_base_bdevs_discovered": 2, 00:29:10.779 "num_base_bdevs_operational": 2, 00:29:10.779 "base_bdevs_list": [ 00:29:10.779 { 00:29:10.779 "name": "BaseBdev1", 00:29:10.779 "uuid": "d9a9f6e8-c127-4068-a349-8bf99b650604", 00:29:10.779 "is_configured": true, 00:29:10.779 "data_offset": 256, 00:29:10.779 "data_size": 7936 00:29:10.779 }, 00:29:10.779 { 00:29:10.779 "name": "BaseBdev2", 00:29:10.779 "uuid": "13352002-a4ee-4dfa-a29b-dfc2be370fe0", 00:29:10.779 "is_configured": true, 00:29:10.779 "data_offset": 256, 00:29:10.779 "data_size": 7936 00:29:10.779 } 00:29:10.779 ] 00:29:10.779 } 00:29:10.779 } 00:29:10.779 }' 00:29:10.779 22:36:21 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:10.779 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:10.779 BaseBdev2' 00:29:10.779 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:10.779 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:10.779 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:11.037 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:11.037 "name": "BaseBdev1", 00:29:11.037 "aliases": [ 00:29:11.037 "d9a9f6e8-c127-4068-a349-8bf99b650604" 00:29:11.037 ], 00:29:11.037 "product_name": "Malloc disk", 00:29:11.037 "block_size": 4128, 00:29:11.037 "num_blocks": 8192, 00:29:11.037 "uuid": "d9a9f6e8-c127-4068-a349-8bf99b650604", 00:29:11.037 "md_size": 32, 00:29:11.037 "md_interleave": true, 00:29:11.037 "dif_type": 0, 00:29:11.037 "assigned_rate_limits": { 00:29:11.037 "rw_ios_per_sec": 0, 00:29:11.037 "rw_mbytes_per_sec": 0, 00:29:11.037 "r_mbytes_per_sec": 0, 00:29:11.037 "w_mbytes_per_sec": 0 00:29:11.037 }, 00:29:11.037 "claimed": true, 00:29:11.037 "claim_type": "exclusive_write", 00:29:11.037 "zoned": false, 00:29:11.037 "supported_io_types": { 00:29:11.037 "read": true, 00:29:11.037 "write": true, 00:29:11.037 "unmap": true, 00:29:11.037 "flush": true, 00:29:11.037 "reset": true, 00:29:11.037 "nvme_admin": false, 00:29:11.037 "nvme_io": false, 00:29:11.037 "nvme_io_md": false, 00:29:11.037 "write_zeroes": true, 00:29:11.037 "zcopy": true, 00:29:11.037 "get_zone_info": false, 00:29:11.037 "zone_management": false, 00:29:11.037 "zone_append": false, 00:29:11.037 "compare": false, 00:29:11.038 "compare_and_write": false, 00:29:11.038 "abort": true, 00:29:11.038 "seek_hole": false, 00:29:11.038 "seek_data": false, 00:29:11.038 "copy": true, 00:29:11.038 "nvme_iov_md": false 00:29:11.038 }, 00:29:11.038 "memory_domains": [ 00:29:11.038 { 00:29:11.038 "dma_device_id": "system", 00:29:11.038 "dma_device_type": 1 00:29:11.038 }, 00:29:11.038 { 00:29:11.038 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:11.038 "dma_device_type": 2 00:29:11.038 } 00:29:11.038 ], 00:29:11.038 "driver_specific": {} 00:29:11.038 }' 00:29:11.038 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:11.038 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:11.297 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:11.556 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:11.556 "name": "BaseBdev2", 00:29:11.556 "aliases": [ 00:29:11.556 "13352002-a4ee-4dfa-a29b-dfc2be370fe0" 00:29:11.556 ], 00:29:11.556 "product_name": "Malloc disk", 00:29:11.556 "block_size": 4128, 00:29:11.556 "num_blocks": 8192, 00:29:11.556 "uuid": "13352002-a4ee-4dfa-a29b-dfc2be370fe0", 00:29:11.556 "md_size": 32, 00:29:11.556 "md_interleave": true, 00:29:11.556 "dif_type": 0, 00:29:11.556 "assigned_rate_limits": { 00:29:11.556 "rw_ios_per_sec": 0, 00:29:11.556 "rw_mbytes_per_sec": 0, 00:29:11.556 "r_mbytes_per_sec": 0, 00:29:11.556 "w_mbytes_per_sec": 0 00:29:11.556 }, 00:29:11.556 "claimed": true, 00:29:11.556 "claim_type": "exclusive_write", 00:29:11.556 "zoned": false, 00:29:11.556 "supported_io_types": { 00:29:11.556 "read": true, 00:29:11.556 "write": true, 00:29:11.556 "unmap": true, 00:29:11.556 "flush": true, 00:29:11.556 "reset": true, 00:29:11.556 "nvme_admin": false, 00:29:11.556 "nvme_io": false, 00:29:11.556 "nvme_io_md": false, 00:29:11.556 "write_zeroes": true, 00:29:11.556 "zcopy": true, 00:29:11.556 "get_zone_info": false, 00:29:11.556 "zone_management": false, 00:29:11.556 "zone_append": false, 00:29:11.556 "compare": false, 00:29:11.556 "compare_and_write": false, 00:29:11.556 "abort": true, 00:29:11.556 "seek_hole": false, 00:29:11.556 "seek_data": false, 00:29:11.556 "copy": true, 00:29:11.556 "nvme_iov_md": false 00:29:11.556 }, 00:29:11.556 "memory_domains": [ 00:29:11.556 { 00:29:11.556 "dma_device_id": "system", 00:29:11.556 "dma_device_type": 1 00:29:11.556 }, 00:29:11.556 { 00:29:11.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:11.556 "dma_device_type": 2 00:29:11.556 } 00:29:11.556 ], 00:29:11.556 "driver_specific": {} 00:29:11.556 }' 00:29:11.556 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:11.556 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:11.556 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:11.556 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:11.815 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:11.815 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 
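The per-bdev checks above reduce to a few invariants for a malloc bdev created with "32 4096 -m 32 -i": the reported block_size of 4128 matches 4096 data bytes plus the 32 interleaved metadata bytes, and num_blocks 8192 matches 32 MB divided by the 4096-byte block size. A minimal stand-alone sketch of the same checks (jq field names as used by the harness; the inline [[ ]] test form is an illustrative addition):

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
info=$($rpc bdev_get_bdevs -b BaseBdev2 | jq '.[]')

[[ $(jq .block_size    <<< "$info") -eq 4128 ]]   # 4096 data + 32 md bytes, interleaved
[[ $(jq .md_size       <<< "$info") -eq 32   ]]
[[ $(jq .md_interleave <<< "$info") == true  ]]
[[ $(jq .dif_type      <<< "$info") -eq 0    ]]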
00:29:11.815 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:11.815 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:11.815 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:11.815 22:36:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:11.815 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:11.815 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:11.815 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:12.074 [2024-07-12 22:36:22.306155] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:12.074 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:12.334 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:12.334 "name": "Existed_Raid", 00:29:12.334 "uuid": 
"8558a978-ed10-4114-a748-3918de06a846", 00:29:12.334 "strip_size_kb": 0, 00:29:12.334 "state": "online", 00:29:12.334 "raid_level": "raid1", 00:29:12.334 "superblock": true, 00:29:12.334 "num_base_bdevs": 2, 00:29:12.334 "num_base_bdevs_discovered": 1, 00:29:12.334 "num_base_bdevs_operational": 1, 00:29:12.334 "base_bdevs_list": [ 00:29:12.334 { 00:29:12.334 "name": null, 00:29:12.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:12.334 "is_configured": false, 00:29:12.334 "data_offset": 256, 00:29:12.334 "data_size": 7936 00:29:12.334 }, 00:29:12.334 { 00:29:12.334 "name": "BaseBdev2", 00:29:12.334 "uuid": "13352002-a4ee-4dfa-a29b-dfc2be370fe0", 00:29:12.334 "is_configured": true, 00:29:12.334 "data_offset": 256, 00:29:12.334 "data_size": 7936 00:29:12.334 } 00:29:12.334 ] 00:29:12.334 }' 00:29:12.334 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:12.334 22:36:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:12.901 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:12.901 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:12.901 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:12.901 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:13.159 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:13.159 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:13.159 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:13.417 [2024-07-12 22:36:23.638785] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:13.417 [2024-07-12 22:36:23.638879] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:13.417 [2024-07-12 22:36:23.652032] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:13.417 [2024-07-12 22:36:23.652075] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:13.417 [2024-07-12 22:36:23.652087] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfed180 name Existed_Raid, state offline 00:29:13.417 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:13.417 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:13.417 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.417 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:13.676 22:36:23 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 3576741 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 3576741 ']' 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 3576741 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3576741 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3576741' 00:29:13.676 killing process with pid 3576741 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 3576741 00:29:13.676 [2024-07-12 22:36:23.956109] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:13.676 22:36:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 3576741 00:29:13.676 [2024-07-12 22:36:23.957096] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:13.935 22:36:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:29:13.935 00:29:13.935 real 0m10.330s 00:29:13.935 user 0m18.370s 00:29:13.935 sys 0m1.940s 00:29:13.935 22:36:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:13.935 22:36:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:13.935 ************************************ 00:29:13.935 END TEST raid_state_function_test_sb_md_interleaved 00:29:13.935 ************************************ 00:29:13.935 22:36:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:13.935 22:36:24 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:29:13.935 22:36:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:29:13.935 22:36:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:13.935 22:36:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:13.935 ************************************ 00:29:13.935 START TEST raid_superblock_test_md_interleaved 00:29:13.935 ************************************ 00:29:13.935 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:29:13.935 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:29:13.935 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local 
num_base_bdevs=2 00:29:13.935 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:29:13.935 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:29:13.935 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:29:13.935 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=3578268 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 3578268 /var/tmp/spdk-raid.sock 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 3578268 ']' 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:14.193 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:14.193 22:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:14.193 [2024-07-12 22:36:24.310538] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
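The raid_superblock_test_md_interleaved run that follows repeats the same md-interleaved setup, but layers passthru bdevs (pt1/pt2) over the malloc bdevs before assembling the raid. A condensed sketch of the creation sequence visible below, with names, UUIDs and sizes copied from the log; the loop and the rpc() wrapper are illustrative additions:

rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

for i in 1 2; do
    # 32 MB malloc bdev, 4096-byte blocks, 32-byte interleaved metadata ...
    rpc bdev_malloc_create 32 4096 -m 32 -i -b "malloc$i"
    # ... exposed through a passthru bdev with a fixed, well-known UUID.
    rpc bdev_passthru_create -b "malloc$i" -p "pt$i" -u "00000000-0000-0000-0000-00000000000$i"
done

# Assemble the raid1 volume with an on-disk superblock (-s) on top of the passthru bdevs.
rpc bdev_raid_create -s -r raid1 -b 'pt1 pt2' -n raid_bdev1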
00:29:14.193 [2024-07-12 22:36:24.310600] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3578268 ] 00:29:14.193 [2024-07-12 22:36:24.439877] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:14.452 [2024-07-12 22:36:24.544391] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:14.452 [2024-07-12 22:36:24.598482] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:14.452 [2024-07-12 22:36:24.598515] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:15.018 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:15.018 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:15.018 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:29:15.018 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:15.018 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:29:15.018 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:29:15.018 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:29:15.018 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:15.018 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:15.018 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:15.018 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:29:15.277 malloc1 00:29:15.277 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:15.536 [2024-07-12 22:36:25.728437] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:15.536 [2024-07-12 22:36:25.728489] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:15.536 [2024-07-12 22:36:25.728512] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12d24e0 00:29:15.536 [2024-07-12 22:36:25.728525] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:15.536 [2024-07-12 22:36:25.730101] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:15.536 [2024-07-12 22:36:25.730129] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:15.536 pt1 00:29:15.536 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:15.536 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:15.536 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:29:15.536 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:29:15.536 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:29:15.536 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:15.536 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:15.536 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:15.536 22:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:29:15.795 malloc2 00:29:15.795 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:16.054 [2024-07-12 22:36:26.228124] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:16.054 [2024-07-12 22:36:26.228171] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:16.054 [2024-07-12 22:36:26.228191] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b7570 00:29:16.054 [2024-07-12 22:36:26.228204] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:16.054 [2024-07-12 22:36:26.229712] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:16.054 [2024-07-12 22:36:26.229740] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:16.054 pt2 00:29:16.054 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:16.054 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:16.054 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:29:16.313 [2024-07-12 22:36:26.472790] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:16.313 [2024-07-12 22:36:26.474319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:16.313 [2024-07-12 22:36:26.474483] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12b8f20 00:29:16.313 [2024-07-12 22:36:26.474496] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:16.313 [2024-07-12 22:36:26.474567] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1135050 00:29:16.313 [2024-07-12 22:36:26.474656] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12b8f20 00:29:16.313 [2024-07-12 22:36:26.474666] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12b8f20 00:29:16.313 [2024-07-12 22:36:26.474729] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:16.313 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:16.313 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:16.313 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:16.313 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:16.313 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:16.313 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:16.313 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:16.313 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:16.313 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:16.313 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:16.313 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.313 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:16.573 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:16.573 "name": "raid_bdev1", 00:29:16.573 "uuid": "851a7167-1ee5-4cf7-beaa-2e5d45946158", 00:29:16.573 "strip_size_kb": 0, 00:29:16.573 "state": "online", 00:29:16.573 "raid_level": "raid1", 00:29:16.573 "superblock": true, 00:29:16.573 "num_base_bdevs": 2, 00:29:16.573 "num_base_bdevs_discovered": 2, 00:29:16.573 "num_base_bdevs_operational": 2, 00:29:16.573 "base_bdevs_list": [ 00:29:16.573 { 00:29:16.573 "name": "pt1", 00:29:16.573 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:16.573 "is_configured": true, 00:29:16.573 "data_offset": 256, 00:29:16.573 "data_size": 7936 00:29:16.573 }, 00:29:16.573 { 00:29:16.573 "name": "pt2", 00:29:16.573 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:16.573 "is_configured": true, 00:29:16.573 "data_offset": 256, 00:29:16.573 "data_size": 7936 00:29:16.573 } 00:29:16.573 ] 00:29:16.573 }' 00:29:16.573 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:16.573 22:36:26 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:17.140 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:29:17.140 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:17.140 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:17.140 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:17.140 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:17.140 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:17.140 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:17.140 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:17.399 [2024-07-12 22:36:27.563938] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:17.399 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:17.399 "name": "raid_bdev1", 00:29:17.399 "aliases": [ 00:29:17.399 "851a7167-1ee5-4cf7-beaa-2e5d45946158" 00:29:17.399 ], 00:29:17.399 "product_name": "Raid Volume", 00:29:17.399 "block_size": 4128, 00:29:17.399 "num_blocks": 7936, 00:29:17.399 "uuid": "851a7167-1ee5-4cf7-beaa-2e5d45946158", 00:29:17.399 "md_size": 32, 00:29:17.399 "md_interleave": true, 00:29:17.399 "dif_type": 0, 00:29:17.399 "assigned_rate_limits": { 00:29:17.399 "rw_ios_per_sec": 0, 00:29:17.399 "rw_mbytes_per_sec": 0, 00:29:17.399 "r_mbytes_per_sec": 0, 00:29:17.399 "w_mbytes_per_sec": 0 00:29:17.399 }, 00:29:17.399 "claimed": false, 00:29:17.399 "zoned": false, 00:29:17.399 "supported_io_types": { 00:29:17.399 "read": true, 00:29:17.399 "write": true, 00:29:17.399 "unmap": false, 00:29:17.399 "flush": false, 00:29:17.399 "reset": true, 00:29:17.399 "nvme_admin": false, 00:29:17.399 "nvme_io": false, 00:29:17.399 "nvme_io_md": false, 00:29:17.399 "write_zeroes": true, 00:29:17.399 "zcopy": false, 00:29:17.399 "get_zone_info": false, 00:29:17.399 "zone_management": false, 00:29:17.399 "zone_append": false, 00:29:17.399 "compare": false, 00:29:17.399 "compare_and_write": false, 00:29:17.399 "abort": false, 00:29:17.399 "seek_hole": false, 00:29:17.399 "seek_data": false, 00:29:17.399 "copy": false, 00:29:17.399 "nvme_iov_md": false 00:29:17.399 }, 00:29:17.399 "memory_domains": [ 00:29:17.399 { 00:29:17.399 "dma_device_id": "system", 00:29:17.399 "dma_device_type": 1 00:29:17.399 }, 00:29:17.399 { 00:29:17.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:17.399 "dma_device_type": 2 00:29:17.399 }, 00:29:17.399 { 00:29:17.399 "dma_device_id": "system", 00:29:17.399 "dma_device_type": 1 00:29:17.399 }, 00:29:17.399 { 00:29:17.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:17.399 "dma_device_type": 2 00:29:17.399 } 00:29:17.399 ], 00:29:17.399 "driver_specific": { 00:29:17.399 "raid": { 00:29:17.399 "uuid": "851a7167-1ee5-4cf7-beaa-2e5d45946158", 00:29:17.399 "strip_size_kb": 0, 00:29:17.399 "state": "online", 00:29:17.399 "raid_level": "raid1", 00:29:17.399 "superblock": true, 00:29:17.399 "num_base_bdevs": 2, 00:29:17.399 "num_base_bdevs_discovered": 2, 00:29:17.399 "num_base_bdevs_operational": 2, 00:29:17.399 "base_bdevs_list": [ 00:29:17.399 { 00:29:17.399 "name": "pt1", 00:29:17.399 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:17.399 "is_configured": true, 00:29:17.399 "data_offset": 256, 00:29:17.399 "data_size": 7936 00:29:17.399 }, 00:29:17.399 { 00:29:17.399 "name": "pt2", 00:29:17.399 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:17.399 "is_configured": true, 00:29:17.399 "data_offset": 256, 00:29:17.399 "data_size": 7936 00:29:17.399 } 00:29:17.399 ] 00:29:17.399 } 00:29:17.399 } 00:29:17.399 }' 00:29:17.399 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:17.399 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:17.399 pt2' 00:29:17.399 
22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:17.399 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:17.399 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:17.658 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:17.658 "name": "pt1", 00:29:17.658 "aliases": [ 00:29:17.658 "00000000-0000-0000-0000-000000000001" 00:29:17.658 ], 00:29:17.658 "product_name": "passthru", 00:29:17.658 "block_size": 4128, 00:29:17.658 "num_blocks": 8192, 00:29:17.658 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:17.658 "md_size": 32, 00:29:17.658 "md_interleave": true, 00:29:17.658 "dif_type": 0, 00:29:17.658 "assigned_rate_limits": { 00:29:17.658 "rw_ios_per_sec": 0, 00:29:17.659 "rw_mbytes_per_sec": 0, 00:29:17.659 "r_mbytes_per_sec": 0, 00:29:17.659 "w_mbytes_per_sec": 0 00:29:17.659 }, 00:29:17.659 "claimed": true, 00:29:17.659 "claim_type": "exclusive_write", 00:29:17.659 "zoned": false, 00:29:17.659 "supported_io_types": { 00:29:17.659 "read": true, 00:29:17.659 "write": true, 00:29:17.659 "unmap": true, 00:29:17.659 "flush": true, 00:29:17.659 "reset": true, 00:29:17.659 "nvme_admin": false, 00:29:17.659 "nvme_io": false, 00:29:17.659 "nvme_io_md": false, 00:29:17.659 "write_zeroes": true, 00:29:17.659 "zcopy": true, 00:29:17.659 "get_zone_info": false, 00:29:17.659 "zone_management": false, 00:29:17.659 "zone_append": false, 00:29:17.659 "compare": false, 00:29:17.659 "compare_and_write": false, 00:29:17.659 "abort": true, 00:29:17.659 "seek_hole": false, 00:29:17.659 "seek_data": false, 00:29:17.659 "copy": true, 00:29:17.659 "nvme_iov_md": false 00:29:17.659 }, 00:29:17.659 "memory_domains": [ 00:29:17.659 { 00:29:17.659 "dma_device_id": "system", 00:29:17.659 "dma_device_type": 1 00:29:17.659 }, 00:29:17.659 { 00:29:17.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:17.659 "dma_device_type": 2 00:29:17.659 } 00:29:17.659 ], 00:29:17.659 "driver_specific": { 00:29:17.659 "passthru": { 00:29:17.659 "name": "pt1", 00:29:17.659 "base_bdev_name": "malloc1" 00:29:17.659 } 00:29:17.659 } 00:29:17.659 }' 00:29:17.659 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:17.659 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:17.659 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:17.659 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:17.918 22:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:17.918 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:17.918 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:17.918 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:17.918 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:17.918 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:17.918 22:36:28 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:17.918 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:17.918 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:17.918 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:17.918 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:18.177 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:18.177 "name": "pt2", 00:29:18.177 "aliases": [ 00:29:18.177 "00000000-0000-0000-0000-000000000002" 00:29:18.177 ], 00:29:18.177 "product_name": "passthru", 00:29:18.177 "block_size": 4128, 00:29:18.177 "num_blocks": 8192, 00:29:18.177 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:18.177 "md_size": 32, 00:29:18.177 "md_interleave": true, 00:29:18.177 "dif_type": 0, 00:29:18.177 "assigned_rate_limits": { 00:29:18.177 "rw_ios_per_sec": 0, 00:29:18.177 "rw_mbytes_per_sec": 0, 00:29:18.177 "r_mbytes_per_sec": 0, 00:29:18.177 "w_mbytes_per_sec": 0 00:29:18.177 }, 00:29:18.177 "claimed": true, 00:29:18.177 "claim_type": "exclusive_write", 00:29:18.177 "zoned": false, 00:29:18.177 "supported_io_types": { 00:29:18.177 "read": true, 00:29:18.177 "write": true, 00:29:18.177 "unmap": true, 00:29:18.177 "flush": true, 00:29:18.177 "reset": true, 00:29:18.177 "nvme_admin": false, 00:29:18.177 "nvme_io": false, 00:29:18.177 "nvme_io_md": false, 00:29:18.177 "write_zeroes": true, 00:29:18.177 "zcopy": true, 00:29:18.177 "get_zone_info": false, 00:29:18.177 "zone_management": false, 00:29:18.177 "zone_append": false, 00:29:18.177 "compare": false, 00:29:18.177 "compare_and_write": false, 00:29:18.177 "abort": true, 00:29:18.177 "seek_hole": false, 00:29:18.177 "seek_data": false, 00:29:18.177 "copy": true, 00:29:18.177 "nvme_iov_md": false 00:29:18.177 }, 00:29:18.177 "memory_domains": [ 00:29:18.177 { 00:29:18.177 "dma_device_id": "system", 00:29:18.177 "dma_device_type": 1 00:29:18.177 }, 00:29:18.177 { 00:29:18.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:18.177 "dma_device_type": 2 00:29:18.177 } 00:29:18.177 ], 00:29:18.177 "driver_specific": { 00:29:18.177 "passthru": { 00:29:18.177 "name": "pt2", 00:29:18.177 "base_bdev_name": "malloc2" 00:29:18.177 } 00:29:18.177 } 00:29:18.177 }' 00:29:18.177 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:18.435 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:18.435 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:18.435 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:18.435 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:18.435 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:18.435 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:18.435 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:18.435 22:36:28 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:18.435 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:18.750 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:18.750 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:18.750 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:18.750 22:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:29:18.750 [2024-07-12 22:36:29.043841] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:18.750 22:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=851a7167-1ee5-4cf7-beaa-2e5d45946158 00:29:18.750 22:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 851a7167-1ee5-4cf7-beaa-2e5d45946158 ']' 00:29:18.750 22:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:19.009 [2024-07-12 22:36:29.288239] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:19.009 [2024-07-12 22:36:29.288264] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:19.009 [2024-07-12 22:36:29.288325] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:19.009 [2024-07-12 22:36:29.288381] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:19.009 [2024-07-12 22:36:29.288393] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12b8f20 name raid_bdev1, state offline 00:29:19.010 22:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:19.010 22:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:29:19.269 22:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:29:19.269 22:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:29:19.269 22:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:19.269 22:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:19.527 22:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:19.527 22:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:19.784 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:29:19.784 22:36:30 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:29:20.042 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:29:20.042 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:20.042 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:29:20.042 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:20.042 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:20.042 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:20.043 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:20.043 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:20.043 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:20.043 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:20.043 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:20.043 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:20.043 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:20.301 [2024-07-12 22:36:30.519461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:29:20.301 [2024-07-12 22:36:30.520855] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:29:20.301 [2024-07-12 22:36:30.520914] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:29:20.301 [2024-07-12 22:36:30.520962] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:29:20.301 [2024-07-12 22:36:30.520982] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:20.301 [2024-07-12 22:36:30.520992] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12c3260 name raid_bdev1, state configuring 00:29:20.301 request: 00:29:20.301 { 00:29:20.301 "name": "raid_bdev1", 00:29:20.301 "raid_level": "raid1", 00:29:20.301 "base_bdevs": [ 00:29:20.301 "malloc1", 00:29:20.301 "malloc2" 00:29:20.301 ], 00:29:20.301 "superblock": false, 00:29:20.301 "method": 
"bdev_raid_create", 00:29:20.301 "req_id": 1 00:29:20.301 } 00:29:20.301 Got JSON-RPC error response 00:29:20.301 response: 00:29:20.301 { 00:29:20.301 "code": -17, 00:29:20.301 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:29:20.301 } 00:29:20.301 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:29:20.301 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:20.301 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:20.301 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:20.301 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.301 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:29:20.559 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:29:20.559 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:29:20.559 22:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:20.817 [2024-07-12 22:36:31.012700] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:20.817 [2024-07-12 22:36:31.012751] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:20.817 [2024-07-12 22:36:31.012770] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12ba000 00:29:20.817 [2024-07-12 22:36:31.012789] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:20.817 [2024-07-12 22:36:31.014229] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:20.817 [2024-07-12 22:36:31.014255] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:20.817 [2024-07-12 22:36:31.014303] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:20.817 [2024-07-12 22:36:31.014329] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:20.817 pt1 00:29:20.817 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:29:20.817 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:20.817 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:20.817 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:20.817 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:20.817 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:20.817 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:20.817 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:29:20.817 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:20.817 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:20.817 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.817 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:21.075 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:21.075 "name": "raid_bdev1", 00:29:21.075 "uuid": "851a7167-1ee5-4cf7-beaa-2e5d45946158", 00:29:21.075 "strip_size_kb": 0, 00:29:21.075 "state": "configuring", 00:29:21.075 "raid_level": "raid1", 00:29:21.075 "superblock": true, 00:29:21.075 "num_base_bdevs": 2, 00:29:21.075 "num_base_bdevs_discovered": 1, 00:29:21.075 "num_base_bdevs_operational": 2, 00:29:21.075 "base_bdevs_list": [ 00:29:21.075 { 00:29:21.075 "name": "pt1", 00:29:21.075 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:21.075 "is_configured": true, 00:29:21.075 "data_offset": 256, 00:29:21.075 "data_size": 7936 00:29:21.075 }, 00:29:21.075 { 00:29:21.075 "name": null, 00:29:21.075 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:21.075 "is_configured": false, 00:29:21.075 "data_offset": 256, 00:29:21.075 "data_size": 7936 00:29:21.075 } 00:29:21.075 ] 00:29:21.075 }' 00:29:21.075 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:21.075 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:21.640 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:29:21.640 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:29:21.640 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:21.640 22:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:21.899 [2024-07-12 22:36:32.139686] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:21.899 [2024-07-12 22:36:32.139740] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:21.899 [2024-07-12 22:36:32.139761] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12bc270 00:29:21.899 [2024-07-12 22:36:32.139774] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:21.899 [2024-07-12 22:36:32.139960] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:21.899 [2024-07-12 22:36:32.139977] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:21.899 [2024-07-12 22:36:32.140030] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:21.899 [2024-07-12 22:36:32.140049] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:21.899 [2024-07-12 22:36:32.140132] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1135c10 00:29:21.899 [2024-07-12 22:36:32.140143] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:21.899 [2024-07-12 22:36:32.140200] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12b7d40 00:29:21.899 [2024-07-12 22:36:32.140274] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1135c10 00:29:21.899 [2024-07-12 22:36:32.140283] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1135c10 00:29:21.899 [2024-07-12 22:36:32.140340] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:21.899 pt2 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:21.899 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:22.157 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:22.157 "name": "raid_bdev1", 00:29:22.157 "uuid": "851a7167-1ee5-4cf7-beaa-2e5d45946158", 00:29:22.157 "strip_size_kb": 0, 00:29:22.157 "state": "online", 00:29:22.157 "raid_level": "raid1", 00:29:22.157 "superblock": true, 00:29:22.157 "num_base_bdevs": 2, 00:29:22.157 "num_base_bdevs_discovered": 2, 00:29:22.157 "num_base_bdevs_operational": 2, 00:29:22.157 "base_bdevs_list": [ 00:29:22.157 { 00:29:22.157 "name": "pt1", 00:29:22.157 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:22.157 "is_configured": true, 00:29:22.157 "data_offset": 256, 00:29:22.157 "data_size": 7936 00:29:22.157 }, 00:29:22.157 { 00:29:22.157 "name": "pt2", 00:29:22.157 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:22.157 "is_configured": true, 00:29:22.157 "data_offset": 256, 00:29:22.157 "data_size": 7936 00:29:22.157 } 00:29:22.157 ] 00:29:22.157 }' 00:29:22.157 22:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:22.157 22:36:32 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:22.723 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:29:22.723 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:22.723 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:22.723 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:22.723 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:22.723 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:22.723 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:22.723 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:22.980 [2024-07-12 22:36:33.234836] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:22.980 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:22.980 "name": "raid_bdev1", 00:29:22.980 "aliases": [ 00:29:22.980 "851a7167-1ee5-4cf7-beaa-2e5d45946158" 00:29:22.980 ], 00:29:22.980 "product_name": "Raid Volume", 00:29:22.980 "block_size": 4128, 00:29:22.980 "num_blocks": 7936, 00:29:22.980 "uuid": "851a7167-1ee5-4cf7-beaa-2e5d45946158", 00:29:22.980 "md_size": 32, 00:29:22.980 "md_interleave": true, 00:29:22.980 "dif_type": 0, 00:29:22.980 "assigned_rate_limits": { 00:29:22.980 "rw_ios_per_sec": 0, 00:29:22.980 "rw_mbytes_per_sec": 0, 00:29:22.980 "r_mbytes_per_sec": 0, 00:29:22.980 "w_mbytes_per_sec": 0 00:29:22.980 }, 00:29:22.980 "claimed": false, 00:29:22.980 "zoned": false, 00:29:22.980 "supported_io_types": { 00:29:22.980 "read": true, 00:29:22.980 "write": true, 00:29:22.980 "unmap": false, 00:29:22.980 "flush": false, 00:29:22.980 "reset": true, 00:29:22.980 "nvme_admin": false, 00:29:22.980 "nvme_io": false, 00:29:22.980 "nvme_io_md": false, 00:29:22.980 "write_zeroes": true, 00:29:22.980 "zcopy": false, 00:29:22.980 "get_zone_info": false, 00:29:22.981 "zone_management": false, 00:29:22.981 "zone_append": false, 00:29:22.981 "compare": false, 00:29:22.981 "compare_and_write": false, 00:29:22.981 "abort": false, 00:29:22.981 "seek_hole": false, 00:29:22.981 "seek_data": false, 00:29:22.981 "copy": false, 00:29:22.981 "nvme_iov_md": false 00:29:22.981 }, 00:29:22.981 "memory_domains": [ 00:29:22.981 { 00:29:22.981 "dma_device_id": "system", 00:29:22.981 "dma_device_type": 1 00:29:22.981 }, 00:29:22.981 { 00:29:22.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:22.981 "dma_device_type": 2 00:29:22.981 }, 00:29:22.981 { 00:29:22.981 "dma_device_id": "system", 00:29:22.981 "dma_device_type": 1 00:29:22.981 }, 00:29:22.981 { 00:29:22.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:22.981 "dma_device_type": 2 00:29:22.981 } 00:29:22.981 ], 00:29:22.981 "driver_specific": { 00:29:22.981 "raid": { 00:29:22.981 "uuid": "851a7167-1ee5-4cf7-beaa-2e5d45946158", 00:29:22.981 "strip_size_kb": 0, 00:29:22.981 "state": "online", 00:29:22.981 "raid_level": "raid1", 00:29:22.981 "superblock": true, 00:29:22.981 "num_base_bdevs": 2, 00:29:22.981 
"num_base_bdevs_discovered": 2, 00:29:22.981 "num_base_bdevs_operational": 2, 00:29:22.981 "base_bdevs_list": [ 00:29:22.981 { 00:29:22.981 "name": "pt1", 00:29:22.981 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:22.981 "is_configured": true, 00:29:22.981 "data_offset": 256, 00:29:22.981 "data_size": 7936 00:29:22.981 }, 00:29:22.981 { 00:29:22.981 "name": "pt2", 00:29:22.981 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:22.981 "is_configured": true, 00:29:22.981 "data_offset": 256, 00:29:22.981 "data_size": 7936 00:29:22.981 } 00:29:22.981 ] 00:29:22.981 } 00:29:22.981 } 00:29:22.981 }' 00:29:22.981 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:22.981 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:22.981 pt2' 00:29:22.981 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:23.239 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:23.239 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:23.239 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:23.239 "name": "pt1", 00:29:23.239 "aliases": [ 00:29:23.239 "00000000-0000-0000-0000-000000000001" 00:29:23.239 ], 00:29:23.239 "product_name": "passthru", 00:29:23.239 "block_size": 4128, 00:29:23.239 "num_blocks": 8192, 00:29:23.239 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:23.239 "md_size": 32, 00:29:23.239 "md_interleave": true, 00:29:23.239 "dif_type": 0, 00:29:23.239 "assigned_rate_limits": { 00:29:23.239 "rw_ios_per_sec": 0, 00:29:23.239 "rw_mbytes_per_sec": 0, 00:29:23.239 "r_mbytes_per_sec": 0, 00:29:23.239 "w_mbytes_per_sec": 0 00:29:23.239 }, 00:29:23.239 "claimed": true, 00:29:23.239 "claim_type": "exclusive_write", 00:29:23.239 "zoned": false, 00:29:23.239 "supported_io_types": { 00:29:23.239 "read": true, 00:29:23.239 "write": true, 00:29:23.239 "unmap": true, 00:29:23.239 "flush": true, 00:29:23.239 "reset": true, 00:29:23.239 "nvme_admin": false, 00:29:23.239 "nvme_io": false, 00:29:23.239 "nvme_io_md": false, 00:29:23.239 "write_zeroes": true, 00:29:23.240 "zcopy": true, 00:29:23.240 "get_zone_info": false, 00:29:23.240 "zone_management": false, 00:29:23.240 "zone_append": false, 00:29:23.240 "compare": false, 00:29:23.240 "compare_and_write": false, 00:29:23.240 "abort": true, 00:29:23.240 "seek_hole": false, 00:29:23.240 "seek_data": false, 00:29:23.240 "copy": true, 00:29:23.240 "nvme_iov_md": false 00:29:23.240 }, 00:29:23.240 "memory_domains": [ 00:29:23.240 { 00:29:23.240 "dma_device_id": "system", 00:29:23.240 "dma_device_type": 1 00:29:23.240 }, 00:29:23.240 { 00:29:23.240 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:23.240 "dma_device_type": 2 00:29:23.240 } 00:29:23.240 ], 00:29:23.240 "driver_specific": { 00:29:23.240 "passthru": { 00:29:23.240 "name": "pt1", 00:29:23.240 "base_bdev_name": "malloc1" 00:29:23.240 } 00:29:23.240 } 00:29:23.240 }' 00:29:23.240 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:23.240 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:29:23.240 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:23.240 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:23.499 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:23.499 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:23.499 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:23.499 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:23.499 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:23.499 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:23.499 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:23.499 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:23.499 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:23.499 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:23.499 22:36:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:23.758 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:23.758 "name": "pt2", 00:29:23.758 "aliases": [ 00:29:23.758 "00000000-0000-0000-0000-000000000002" 00:29:23.758 ], 00:29:23.758 "product_name": "passthru", 00:29:23.758 "block_size": 4128, 00:29:23.758 "num_blocks": 8192, 00:29:23.758 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:23.758 "md_size": 32, 00:29:23.758 "md_interleave": true, 00:29:23.758 "dif_type": 0, 00:29:23.758 "assigned_rate_limits": { 00:29:23.758 "rw_ios_per_sec": 0, 00:29:23.758 "rw_mbytes_per_sec": 0, 00:29:23.758 "r_mbytes_per_sec": 0, 00:29:23.758 "w_mbytes_per_sec": 0 00:29:23.758 }, 00:29:23.758 "claimed": true, 00:29:23.758 "claim_type": "exclusive_write", 00:29:23.758 "zoned": false, 00:29:23.758 "supported_io_types": { 00:29:23.758 "read": true, 00:29:23.758 "write": true, 00:29:23.758 "unmap": true, 00:29:23.758 "flush": true, 00:29:23.758 "reset": true, 00:29:23.758 "nvme_admin": false, 00:29:23.758 "nvme_io": false, 00:29:23.758 "nvme_io_md": false, 00:29:23.758 "write_zeroes": true, 00:29:23.758 "zcopy": true, 00:29:23.758 "get_zone_info": false, 00:29:23.758 "zone_management": false, 00:29:23.758 "zone_append": false, 00:29:23.758 "compare": false, 00:29:23.758 "compare_and_write": false, 00:29:23.758 "abort": true, 00:29:23.758 "seek_hole": false, 00:29:23.758 "seek_data": false, 00:29:23.758 "copy": true, 00:29:23.758 "nvme_iov_md": false 00:29:23.758 }, 00:29:23.758 "memory_domains": [ 00:29:23.758 { 00:29:23.758 "dma_device_id": "system", 00:29:23.758 "dma_device_type": 1 00:29:23.758 }, 00:29:23.758 { 00:29:23.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:23.758 "dma_device_type": 2 00:29:23.758 } 00:29:23.758 ], 00:29:23.758 "driver_specific": { 00:29:23.758 "passthru": { 00:29:23.758 "name": "pt2", 00:29:23.758 "base_bdev_name": "malloc2" 00:29:23.758 } 00:29:23.758 } 00:29:23.758 }' 00:29:23.759 22:36:34 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:24.018 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:24.018 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:24.018 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:24.018 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:24.018 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:24.018 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:24.018 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:24.018 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:24.018 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:24.278 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:24.278 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:24.278 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:24.278 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:29:24.537 [2024-07-12 22:36:34.642589] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:24.537 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 851a7167-1ee5-4cf7-beaa-2e5d45946158 '!=' 851a7167-1ee5-4cf7-beaa-2e5d45946158 ']' 00:29:24.537 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:29:24.537 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:24.537 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:24.537 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:24.796 [2024-07-12 22:36:34.878974] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:29:24.796 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:24.796 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:24.796 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:24.796 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:24.796 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:24.796 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:24.796 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:24.796 22:36:34 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:24.796 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:24.796 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:24.796 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:24.796 22:36:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:25.055 22:36:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:25.055 "name": "raid_bdev1", 00:29:25.055 "uuid": "851a7167-1ee5-4cf7-beaa-2e5d45946158", 00:29:25.055 "strip_size_kb": 0, 00:29:25.055 "state": "online", 00:29:25.055 "raid_level": "raid1", 00:29:25.055 "superblock": true, 00:29:25.055 "num_base_bdevs": 2, 00:29:25.055 "num_base_bdevs_discovered": 1, 00:29:25.055 "num_base_bdevs_operational": 1, 00:29:25.055 "base_bdevs_list": [ 00:29:25.055 { 00:29:25.055 "name": null, 00:29:25.055 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:25.055 "is_configured": false, 00:29:25.055 "data_offset": 256, 00:29:25.055 "data_size": 7936 00:29:25.055 }, 00:29:25.055 { 00:29:25.055 "name": "pt2", 00:29:25.055 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:25.055 "is_configured": true, 00:29:25.055 "data_offset": 256, 00:29:25.055 "data_size": 7936 00:29:25.055 } 00:29:25.055 ] 00:29:25.055 }' 00:29:25.055 22:36:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:25.055 22:36:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:25.623 22:36:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:25.882 [2024-07-12 22:36:35.949788] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:25.882 [2024-07-12 22:36:35.949815] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:25.882 [2024-07-12 22:36:35.949869] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:25.882 [2024-07-12 22:36:35.949913] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:25.882 [2024-07-12 22:36:35.949933] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1135c10 name raid_bdev1, state offline 00:29:25.882 22:36:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:25.882 22:36:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:29:25.882 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:29:25.882 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:29:25.882 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:29:25.882 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 
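After pt1 is removed, the raid1 volume is expected to stay online in a degraded state with only one discovered and one operational base bdev, which is what the verify_raid_bdev_state call above asserts. A minimal sketch of that check, under the same rpc.py and socket assumptions:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
"$rpc" -s "$sock" bdev_passthru_delete pt1
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[[ $(jq -r .state                      <<< "$info") == online ]] || exit 1
[[ $(jq -r .raid_level                 <<< "$info") == raid1  ]] || exit 1
[[ $(jq -r .num_base_bdevs_discovered  <<< "$info") == 1      ]] || exit 1
[[ $(jq -r .num_base_bdevs_operational <<< "$info") == 1      ]] || exit 1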
00:29:25.882 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:26.142 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:29:26.142 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:26.142 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:29:26.142 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:29:26.142 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:29:26.142 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:26.401 [2024-07-12 22:36:36.671673] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:26.401 [2024-07-12 22:36:36.671725] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:26.401 [2024-07-12 22:36:36.671745] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12ba9f0 00:29:26.401 [2024-07-12 22:36:36.671757] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:26.401 [2024-07-12 22:36:36.673242] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:26.401 [2024-07-12 22:36:36.673270] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:26.401 [2024-07-12 22:36:36.673321] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:26.401 [2024-07-12 22:36:36.673348] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:26.401 [2024-07-12 22:36:36.673419] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12bbea0 00:29:26.401 [2024-07-12 22:36:36.673429] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:26.401 [2024-07-12 22:36:36.673487] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12b9bc0 00:29:26.401 [2024-07-12 22:36:36.673560] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12bbea0 00:29:26.401 [2024-07-12 22:36:36.673571] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12bbea0 00:29:26.401 [2024-07-12 22:36:36.673639] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:26.401 pt2 00:29:26.401 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:26.401 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:26.401 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:26.401 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:26.401 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:26.401 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:29:26.401 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:26.401 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:26.401 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:26.401 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:26.401 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:26.401 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:26.661 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:26.661 "name": "raid_bdev1", 00:29:26.661 "uuid": "851a7167-1ee5-4cf7-beaa-2e5d45946158", 00:29:26.661 "strip_size_kb": 0, 00:29:26.661 "state": "online", 00:29:26.661 "raid_level": "raid1", 00:29:26.661 "superblock": true, 00:29:26.661 "num_base_bdevs": 2, 00:29:26.661 "num_base_bdevs_discovered": 1, 00:29:26.661 "num_base_bdevs_operational": 1, 00:29:26.661 "base_bdevs_list": [ 00:29:26.661 { 00:29:26.661 "name": null, 00:29:26.661 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:26.661 "is_configured": false, 00:29:26.661 "data_offset": 256, 00:29:26.661 "data_size": 7936 00:29:26.661 }, 00:29:26.661 { 00:29:26.661 "name": "pt2", 00:29:26.661 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:26.661 "is_configured": true, 00:29:26.661 "data_offset": 256, 00:29:26.661 "data_size": 7936 00:29:26.661 } 00:29:26.661 ] 00:29:26.661 }' 00:29:26.661 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:26.661 22:36:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:27.228 22:36:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:27.487 [2024-07-12 22:36:37.690364] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:27.487 [2024-07-12 22:36:37.690393] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:27.487 [2024-07-12 22:36:37.690444] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:27.487 [2024-07-12 22:36:37.690491] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:27.487 [2024-07-12 22:36:37.690503] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12bbea0 name raid_bdev1, state offline 00:29:27.487 22:36:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:27.487 22:36:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:29:27.745 22:36:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:29:27.745 22:36:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:29:27.745 22:36:37 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:29:27.745 22:36:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:28.004 [2024-07-12 22:36:38.183647] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:28.004 [2024-07-12 22:36:38.183695] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:28.004 [2024-07-12 22:36:38.183712] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12ba620 00:29:28.004 [2024-07-12 22:36:38.183724] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:28.004 [2024-07-12 22:36:38.185149] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:28.004 [2024-07-12 22:36:38.185176] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:28.004 [2024-07-12 22:36:38.185224] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:28.004 [2024-07-12 22:36:38.185250] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:28.004 [2024-07-12 22:36:38.185329] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:29:28.004 [2024-07-12 22:36:38.185343] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:28.004 [2024-07-12 22:36:38.185357] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12bc640 name raid_bdev1, state configuring 00:29:28.004 [2024-07-12 22:36:38.185380] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:28.004 [2024-07-12 22:36:38.185432] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12bc640 00:29:28.004 [2024-07-12 22:36:38.185443] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:28.004 [2024-07-12 22:36:38.185496] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12bb810 00:29:28.004 [2024-07-12 22:36:38.185567] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12bc640 00:29:28.004 [2024-07-12 22:36:38.185577] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12bc640 00:29:28.004 [2024-07-12 22:36:38.185636] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:28.004 pt1 00:29:28.004 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:29:28.004 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:28.004 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:28.004 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:28.004 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:28.004 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:28.004 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:28.004 22:36:38 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:28.004 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:28.004 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:28.004 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:28.004 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:28.004 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:28.263 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:28.263 "name": "raid_bdev1", 00:29:28.263 "uuid": "851a7167-1ee5-4cf7-beaa-2e5d45946158", 00:29:28.263 "strip_size_kb": 0, 00:29:28.263 "state": "online", 00:29:28.263 "raid_level": "raid1", 00:29:28.263 "superblock": true, 00:29:28.263 "num_base_bdevs": 2, 00:29:28.263 "num_base_bdevs_discovered": 1, 00:29:28.263 "num_base_bdevs_operational": 1, 00:29:28.263 "base_bdevs_list": [ 00:29:28.263 { 00:29:28.263 "name": null, 00:29:28.263 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:28.263 "is_configured": false, 00:29:28.263 "data_offset": 256, 00:29:28.263 "data_size": 7936 00:29:28.263 }, 00:29:28.263 { 00:29:28.263 "name": "pt2", 00:29:28.263 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:28.263 "is_configured": true, 00:29:28.263 "data_offset": 256, 00:29:28.263 "data_size": 7936 00:29:28.263 } 00:29:28.263 ] 00:29:28.263 }' 00:29:28.263 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:28.263 22:36:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:28.837 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:29:28.837 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:29:29.098 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:29:29.098 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:29.098 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:29:29.357 [2024-07-12 22:36:39.523460] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:29.357 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 851a7167-1ee5-4cf7-beaa-2e5d45946158 '!=' 851a7167-1ee5-4cf7-beaa-2e5d45946158 ']' 00:29:29.357 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 3578268 00:29:29.357 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 3578268 ']' 00:29:29.357 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 3578268 00:29:29.357 22:36:39 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:29.357 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:29.357 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3578268 00:29:29.357 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:29.357 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:29.357 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3578268' 00:29:29.357 killing process with pid 3578268 00:29:29.357 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 3578268 00:29:29.357 [2024-07-12 22:36:39.592202] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:29.357 [2024-07-12 22:36:39.592259] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:29.357 [2024-07-12 22:36:39.592308] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:29.357 [2024-07-12 22:36:39.592320] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12bc640 name raid_bdev1, state offline 00:29:29.357 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 3578268 00:29:29.357 [2024-07-12 22:36:39.612035] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:29.616 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:29:29.616 00:29:29.616 real 0m15.581s 00:29:29.616 user 0m28.284s 00:29:29.616 sys 0m2.861s 00:29:29.616 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:29.616 22:36:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:29.616 ************************************ 00:29:29.616 END TEST raid_superblock_test_md_interleaved 00:29:29.616 ************************************ 00:29:29.616 22:36:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:29.616 22:36:39 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:29:29.616 22:36:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:29:29.616 22:36:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:29.616 22:36:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:29.616 ************************************ 00:29:29.616 START TEST raid_rebuild_test_sb_md_interleaved 00:29:29.616 ************************************ 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:29:29.616 
22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:29.616 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:29:29.617 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:29:29.617 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:29:29.617 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:29:29.617 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:29:29.617 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:29:29.617 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:29:29.875 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:29:29.875 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:29:29.875 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:29:29.875 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=3580636 00:29:29.875 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 3580636 /var/tmp/spdk-raid.sock 00:29:29.875 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:29:29.875 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 3580636 ']' 00:29:29.875 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:29.875 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:29.875 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:29:29.875 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:29.875 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:29.875 22:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:29.875 [2024-07-12 22:36:40.005343] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:29:29.875 [2024-07-12 22:36:40.005416] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3580636 ] 00:29:29.875 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:29.875 Zero copy mechanism will not be used. 00:29:29.875 [2024-07-12 22:36:40.136976] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:30.134 [2024-07-12 22:36:40.239311] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:30.134 [2024-07-12 22:36:40.303388] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:30.134 [2024-07-12 22:36:40.303428] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:30.705 22:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:30.705 22:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:30.706 22:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:30.706 22:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:29:30.964 BaseBdev1_malloc 00:29:30.964 22:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:31.223 [2024-07-12 22:36:41.338097] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:31.223 [2024-07-12 22:36:41.338145] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:31.223 [2024-07-12 22:36:41.338171] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd03ce0 00:29:31.223 [2024-07-12 22:36:41.338184] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:31.223 [2024-07-12 22:36:41.339689] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:31.223 [2024-07-12 22:36:41.339719] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:31.223 BaseBdev1 00:29:31.223 22:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:31.223 22:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:29:31.223 BaseBdev2_malloc 00:29:31.223 22:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:31.482 [2024-07-12 22:36:41.688167] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:31.482 [2024-07-12 22:36:41.688214] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:31.482 [2024-07-12 22:36:41.688238] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcfb2d0 00:29:31.482 [2024-07-12 22:36:41.688251] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:31.482 [2024-07-12 22:36:41.689921] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:31.482 [2024-07-12 22:36:41.689957] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:31.482 BaseBdev2 00:29:31.482 22:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:29:31.741 spare_malloc 00:29:31.741 22:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:29:32.000 spare_delay 00:29:32.000 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:32.259 [2024-07-12 22:36:42.350772] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:32.259 [2024-07-12 22:36:42.350817] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:32.259 [2024-07-12 22:36:42.350840] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcfe070 00:29:32.259 [2024-07-12 22:36:42.350852] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:32.259 [2024-07-12 22:36:42.352133] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:32.259 [2024-07-12 22:36:42.352160] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:32.259 spare 00:29:32.259 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:29:32.518 [2024-07-12 22:36:42.595443] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:32.518 [2024-07-12 22:36:42.596607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:32.518 [2024-07-12 22:36:42.596765] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd00370 00:29:32.518 [2024-07-12 22:36:42.596779] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:32.518 [2024-07-12 22:36:42.596843] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb669c0 00:29:32.518 [2024-07-12 22:36:42.596936] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd00370 00:29:32.518 [2024-07-12 22:36:42.596946] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd00370 00:29:32.518 [2024-07-12 22:36:42.597000] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:32.518 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:32.518 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:32.518 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:32.518 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:32.518 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:32.518 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:32.518 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:32.518 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:32.518 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:32.518 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:32.518 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:32.518 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:32.777 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:32.777 "name": "raid_bdev1", 00:29:32.777 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:32.777 "strip_size_kb": 0, 00:29:32.777 "state": "online", 00:29:32.777 "raid_level": "raid1", 00:29:32.777 "superblock": true, 00:29:32.777 "num_base_bdevs": 2, 00:29:32.777 "num_base_bdevs_discovered": 2, 00:29:32.777 "num_base_bdevs_operational": 2, 00:29:32.777 "base_bdevs_list": [ 00:29:32.777 { 00:29:32.777 "name": "BaseBdev1", 00:29:32.777 "uuid": "8cf8e7e3-c866-55f5-880e-c4e6d0e72fe0", 00:29:32.777 "is_configured": true, 00:29:32.777 "data_offset": 256, 00:29:32.777 "data_size": 7936 00:29:32.777 }, 00:29:32.777 { 00:29:32.777 "name": "BaseBdev2", 00:29:32.777 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:32.777 "is_configured": true, 00:29:32.777 "data_offset": 256, 00:29:32.777 "data_size": 7936 00:29:32.777 } 00:29:32.777 ] 00:29:32.777 }' 00:29:32.777 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:32.777 22:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:33.343 22:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:33.343 22:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:29:33.602 [2024-07-12 22:36:43.694563] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:33.602 22:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:29:33.602 22:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r 
'.[].base_bdevs_list[0].data_offset' 00:29:33.602 22:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:33.860 22:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:29:33.860 22:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:29:33.860 22:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:29:33.860 22:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:29:34.119 [2024-07-12 22:36:44.191616] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:34.119 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:34.119 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:34.119 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:34.119 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:34.119 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:34.119 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:34.119 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:34.119 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:34.119 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:34.119 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:34.119 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:34.119 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:34.378 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:34.378 "name": "raid_bdev1", 00:29:34.378 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:34.378 "strip_size_kb": 0, 00:29:34.378 "state": "online", 00:29:34.378 "raid_level": "raid1", 00:29:34.378 "superblock": true, 00:29:34.378 "num_base_bdevs": 2, 00:29:34.378 "num_base_bdevs_discovered": 1, 00:29:34.378 "num_base_bdevs_operational": 1, 00:29:34.378 "base_bdevs_list": [ 00:29:34.378 { 00:29:34.378 "name": null, 00:29:34.378 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:34.378 "is_configured": false, 00:29:34.378 "data_offset": 256, 00:29:34.378 "data_size": 7936 00:29:34.378 }, 00:29:34.378 { 00:29:34.378 "name": "BaseBdev2", 00:29:34.378 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:34.378 "is_configured": true, 00:29:34.378 "data_offset": 256, 00:29:34.378 "data_size": 7936 00:29:34.378 } 00:29:34.378 ] 00:29:34.378 }' 00:29:34.378 22:36:44 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:34.378 22:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:34.944 22:36:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:34.944 [2024-07-12 22:36:45.254450] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:34.944 [2024-07-12 22:36:45.258089] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd00250 00:29:34.944 [2024-07-12 22:36:45.260093] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:35.203 22:36:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:29:36.165 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:36.165 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:36.165 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:36.165 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:36.165 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:36.165 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:36.165 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:36.422 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:36.422 "name": "raid_bdev1", 00:29:36.422 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:36.422 "strip_size_kb": 0, 00:29:36.422 "state": "online", 00:29:36.422 "raid_level": "raid1", 00:29:36.422 "superblock": true, 00:29:36.422 "num_base_bdevs": 2, 00:29:36.422 "num_base_bdevs_discovered": 2, 00:29:36.422 "num_base_bdevs_operational": 2, 00:29:36.422 "process": { 00:29:36.422 "type": "rebuild", 00:29:36.422 "target": "spare", 00:29:36.422 "progress": { 00:29:36.422 "blocks": 3072, 00:29:36.422 "percent": 38 00:29:36.422 } 00:29:36.422 }, 00:29:36.422 "base_bdevs_list": [ 00:29:36.422 { 00:29:36.422 "name": "spare", 00:29:36.422 "uuid": "5b71983b-7c18-5547-b740-76032609eda0", 00:29:36.422 "is_configured": true, 00:29:36.422 "data_offset": 256, 00:29:36.422 "data_size": 7936 00:29:36.422 }, 00:29:36.422 { 00:29:36.422 "name": "BaseBdev2", 00:29:36.422 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:36.422 "is_configured": true, 00:29:36.422 "data_offset": 256, 00:29:36.422 "data_size": 7936 00:29:36.422 } 00:29:36.422 ] 00:29:36.422 }' 00:29:36.422 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:36.422 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:36.422 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:36.422 22:36:46 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:36.422 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:36.681 [2024-07-12 22:36:46.857035] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:36.681 [2024-07-12 22:36:46.872932] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:36.681 [2024-07-12 22:36:46.872988] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:36.681 [2024-07-12 22:36:46.873004] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:36.681 [2024-07-12 22:36:46.873013] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:36.681 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:36.681 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:36.681 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:36.681 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:36.681 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:36.681 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:36.681 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:36.681 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:36.681 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:36.681 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:36.681 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:36.681 22:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:36.939 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:36.939 "name": "raid_bdev1", 00:29:36.939 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:36.939 "strip_size_kb": 0, 00:29:36.939 "state": "online", 00:29:36.939 "raid_level": "raid1", 00:29:36.939 "superblock": true, 00:29:36.939 "num_base_bdevs": 2, 00:29:36.939 "num_base_bdevs_discovered": 1, 00:29:36.939 "num_base_bdevs_operational": 1, 00:29:36.939 "base_bdevs_list": [ 00:29:36.939 { 00:29:36.939 "name": null, 00:29:36.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:36.939 "is_configured": false, 00:29:36.939 "data_offset": 256, 00:29:36.939 "data_size": 7936 00:29:36.939 }, 00:29:36.939 { 00:29:36.939 "name": "BaseBdev2", 00:29:36.939 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:36.939 "is_configured": true, 00:29:36.939 "data_offset": 256, 00:29:36.939 "data_size": 7936 00:29:36.939 } 00:29:36.939 ] 00:29:36.939 }' 00:29:36.939 
22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:36.939 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:37.506 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:37.507 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:37.507 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:37.507 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:37.507 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:37.507 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:37.507 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:37.780 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:37.780 "name": "raid_bdev1", 00:29:37.780 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:37.780 "strip_size_kb": 0, 00:29:37.780 "state": "online", 00:29:37.780 "raid_level": "raid1", 00:29:37.780 "superblock": true, 00:29:37.780 "num_base_bdevs": 2, 00:29:37.780 "num_base_bdevs_discovered": 1, 00:29:37.780 "num_base_bdevs_operational": 1, 00:29:37.780 "base_bdevs_list": [ 00:29:37.780 { 00:29:37.780 "name": null, 00:29:37.780 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:37.780 "is_configured": false, 00:29:37.780 "data_offset": 256, 00:29:37.780 "data_size": 7936 00:29:37.780 }, 00:29:37.780 { 00:29:37.780 "name": "BaseBdev2", 00:29:37.780 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:37.780 "is_configured": true, 00:29:37.780 "data_offset": 256, 00:29:37.780 "data_size": 7936 00:29:37.780 } 00:29:37.780 ] 00:29:37.780 }' 00:29:37.780 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:37.780 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:37.780 22:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:37.780 22:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:37.780 22:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:38.052 [2024-07-12 22:36:48.236984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:38.052 [2024-07-12 22:36:48.240647] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcfc270 00:29:38.052 [2024-07-12 22:36:48.242086] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:38.052 22:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:29:38.988 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:29:38.988 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:38.988 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:38.988 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:38.988 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:38.988 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:38.988 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:39.248 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:39.248 "name": "raid_bdev1", 00:29:39.248 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:39.248 "strip_size_kb": 0, 00:29:39.248 "state": "online", 00:29:39.248 "raid_level": "raid1", 00:29:39.248 "superblock": true, 00:29:39.248 "num_base_bdevs": 2, 00:29:39.248 "num_base_bdevs_discovered": 2, 00:29:39.248 "num_base_bdevs_operational": 2, 00:29:39.248 "process": { 00:29:39.248 "type": "rebuild", 00:29:39.248 "target": "spare", 00:29:39.248 "progress": { 00:29:39.248 "blocks": 3072, 00:29:39.248 "percent": 38 00:29:39.248 } 00:29:39.248 }, 00:29:39.248 "base_bdevs_list": [ 00:29:39.248 { 00:29:39.248 "name": "spare", 00:29:39.248 "uuid": "5b71983b-7c18-5547-b740-76032609eda0", 00:29:39.248 "is_configured": true, 00:29:39.248 "data_offset": 256, 00:29:39.248 "data_size": 7936 00:29:39.248 }, 00:29:39.248 { 00:29:39.248 "name": "BaseBdev2", 00:29:39.248 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:39.248 "is_configured": true, 00:29:39.248 "data_offset": 256, 00:29:39.248 "data_size": 7936 00:29:39.248 } 00:29:39.248 ] 00:29:39.248 }' 00:29:39.248 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:39.248 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:39.248 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:29:39.508 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1113 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:39.508 22:36:49 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:39.508 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:39.767 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:39.767 "name": "raid_bdev1", 00:29:39.767 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:39.767 "strip_size_kb": 0, 00:29:39.767 "state": "online", 00:29:39.767 "raid_level": "raid1", 00:29:39.767 "superblock": true, 00:29:39.767 "num_base_bdevs": 2, 00:29:39.767 "num_base_bdevs_discovered": 2, 00:29:39.767 "num_base_bdevs_operational": 2, 00:29:39.767 "process": { 00:29:39.767 "type": "rebuild", 00:29:39.767 "target": "spare", 00:29:39.767 "progress": { 00:29:39.767 "blocks": 3840, 00:29:39.767 "percent": 48 00:29:39.767 } 00:29:39.767 }, 00:29:39.767 "base_bdevs_list": [ 00:29:39.767 { 00:29:39.767 "name": "spare", 00:29:39.767 "uuid": "5b71983b-7c18-5547-b740-76032609eda0", 00:29:39.767 "is_configured": true, 00:29:39.767 "data_offset": 256, 00:29:39.767 "data_size": 7936 00:29:39.767 }, 00:29:39.767 { 00:29:39.767 "name": "BaseBdev2", 00:29:39.767 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:39.767 "is_configured": true, 00:29:39.767 "data_offset": 256, 00:29:39.767 "data_size": 7936 00:29:39.767 } 00:29:39.767 ] 00:29:39.767 }' 00:29:39.767 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:39.767 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:39.767 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:39.767 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:39.767 22:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:40.705 22:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:40.705 22:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:40.705 22:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:40.705 22:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:40.705 22:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:40.705 22:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:40.705 
22:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.705 22:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:40.964 22:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:40.964 "name": "raid_bdev1", 00:29:40.964 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:40.964 "strip_size_kb": 0, 00:29:40.964 "state": "online", 00:29:40.964 "raid_level": "raid1", 00:29:40.964 "superblock": true, 00:29:40.964 "num_base_bdevs": 2, 00:29:40.964 "num_base_bdevs_discovered": 2, 00:29:40.964 "num_base_bdevs_operational": 2, 00:29:40.964 "process": { 00:29:40.964 "type": "rebuild", 00:29:40.964 "target": "spare", 00:29:40.964 "progress": { 00:29:40.964 "blocks": 7424, 00:29:40.964 "percent": 93 00:29:40.964 } 00:29:40.964 }, 00:29:40.964 "base_bdevs_list": [ 00:29:40.964 { 00:29:40.964 "name": "spare", 00:29:40.964 "uuid": "5b71983b-7c18-5547-b740-76032609eda0", 00:29:40.964 "is_configured": true, 00:29:40.964 "data_offset": 256, 00:29:40.964 "data_size": 7936 00:29:40.964 }, 00:29:40.964 { 00:29:40.964 "name": "BaseBdev2", 00:29:40.964 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:40.964 "is_configured": true, 00:29:40.964 "data_offset": 256, 00:29:40.964 "data_size": 7936 00:29:40.964 } 00:29:40.964 ] 00:29:40.964 }' 00:29:40.964 22:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:40.964 22:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:40.964 22:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:41.224 22:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:41.224 22:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:41.224 [2024-07-12 22:36:51.366384] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:41.224 [2024-07-12 22:36:51.366449] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:41.224 [2024-07-12 22:36:51.366540] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:42.162 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:42.162 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:42.162 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:42.162 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:42.162 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:42.162 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:42.162 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.162 22:36:52 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:42.421 "name": "raid_bdev1", 00:29:42.421 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:42.421 "strip_size_kb": 0, 00:29:42.421 "state": "online", 00:29:42.421 "raid_level": "raid1", 00:29:42.421 "superblock": true, 00:29:42.421 "num_base_bdevs": 2, 00:29:42.421 "num_base_bdevs_discovered": 2, 00:29:42.421 "num_base_bdevs_operational": 2, 00:29:42.421 "base_bdevs_list": [ 00:29:42.421 { 00:29:42.421 "name": "spare", 00:29:42.421 "uuid": "5b71983b-7c18-5547-b740-76032609eda0", 00:29:42.421 "is_configured": true, 00:29:42.421 "data_offset": 256, 00:29:42.421 "data_size": 7936 00:29:42.421 }, 00:29:42.421 { 00:29:42.421 "name": "BaseBdev2", 00:29:42.421 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:42.421 "is_configured": true, 00:29:42.421 "data_offset": 256, 00:29:42.421 "data_size": 7936 00:29:42.421 } 00:29:42.421 ] 00:29:42.421 }' 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.421 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:42.698 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:42.698 "name": "raid_bdev1", 00:29:42.698 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:42.698 "strip_size_kb": 0, 00:29:42.699 "state": "online", 00:29:42.699 "raid_level": "raid1", 00:29:42.699 "superblock": true, 00:29:42.699 "num_base_bdevs": 2, 00:29:42.699 "num_base_bdevs_discovered": 2, 00:29:42.699 "num_base_bdevs_operational": 2, 00:29:42.699 "base_bdevs_list": [ 00:29:42.699 { 00:29:42.699 "name": "spare", 00:29:42.699 "uuid": "5b71983b-7c18-5547-b740-76032609eda0", 00:29:42.699 "is_configured": true, 00:29:42.699 "data_offset": 256, 00:29:42.699 "data_size": 7936 00:29:42.699 }, 00:29:42.699 { 00:29:42.699 "name": "BaseBdev2", 00:29:42.699 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:42.699 
"is_configured": true, 00:29:42.699 "data_offset": 256, 00:29:42.699 "data_size": 7936 00:29:42.699 } 00:29:42.699 ] 00:29:42.699 }' 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.699 22:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:42.963 22:36:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:42.963 "name": "raid_bdev1", 00:29:42.963 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:42.963 "strip_size_kb": 0, 00:29:42.963 "state": "online", 00:29:42.963 "raid_level": "raid1", 00:29:42.963 "superblock": true, 00:29:42.963 "num_base_bdevs": 2, 00:29:42.963 "num_base_bdevs_discovered": 2, 00:29:42.963 "num_base_bdevs_operational": 2, 00:29:42.963 "base_bdevs_list": [ 00:29:42.963 { 00:29:42.963 "name": "spare", 00:29:42.963 "uuid": "5b71983b-7c18-5547-b740-76032609eda0", 00:29:42.963 "is_configured": true, 00:29:42.963 "data_offset": 256, 00:29:42.963 "data_size": 7936 00:29:42.963 }, 00:29:42.963 { 00:29:42.963 "name": "BaseBdev2", 00:29:42.963 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:42.963 "is_configured": true, 00:29:42.963 "data_offset": 256, 00:29:42.963 "data_size": 7936 00:29:42.963 } 00:29:42.963 ] 00:29:42.963 }' 00:29:42.963 22:36:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:42.963 22:36:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:43.529 22:36:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:43.789 [2024-07-12 22:36:53.885356] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:43.789 [2024-07-12 22:36:53.885391] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:43.789 [2024-07-12 22:36:53.885451] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:43.789 [2024-07-12 22:36:53.885511] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:43.789 [2024-07-12 22:36:53.885523] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd00370 name raid_bdev1, state offline 00:29:43.789 22:36:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:43.789 22:36:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:29:43.789 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:29:43.789 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:29:43.789 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:29:43.789 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:44.048 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:44.307 [2024-07-12 22:36:54.434790] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:44.307 [2024-07-12 22:36:54.434840] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:44.307 [2024-07-12 22:36:54.434862] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd00040 00:29:44.307 [2024-07-12 22:36:54.434874] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:44.307 [2024-07-12 22:36:54.436409] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:44.307 [2024-07-12 22:36:54.436440] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:44.307 [2024-07-12 22:36:54.436503] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:44.307 [2024-07-12 22:36:54.436533] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:44.307 [2024-07-12 22:36:54.436624] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:44.307 spare 00:29:44.307 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:44.307 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:44.307 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:44.307 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:29:44.307 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:44.307 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:44.307 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:44.307 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:44.307 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:44.307 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:44.307 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:44.307 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:44.307 [2024-07-12 22:36:54.536939] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd00f60 00:29:44.307 [2024-07-12 22:36:54.536963] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:44.307 [2024-07-12 22:36:54.537035] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd00de0 00:29:44.307 [2024-07-12 22:36:54.537128] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd00f60 00:29:44.307 [2024-07-12 22:36:54.537138] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd00f60 00:29:44.307 [2024-07-12 22:36:54.537209] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:44.565 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:44.565 "name": "raid_bdev1", 00:29:44.565 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:44.565 "strip_size_kb": 0, 00:29:44.565 "state": "online", 00:29:44.565 "raid_level": "raid1", 00:29:44.565 "superblock": true, 00:29:44.565 "num_base_bdevs": 2, 00:29:44.565 "num_base_bdevs_discovered": 2, 00:29:44.565 "num_base_bdevs_operational": 2, 00:29:44.565 "base_bdevs_list": [ 00:29:44.565 { 00:29:44.565 "name": "spare", 00:29:44.565 "uuid": "5b71983b-7c18-5547-b740-76032609eda0", 00:29:44.565 "is_configured": true, 00:29:44.565 "data_offset": 256, 00:29:44.565 "data_size": 7936 00:29:44.565 }, 00:29:44.565 { 00:29:44.565 "name": "BaseBdev2", 00:29:44.565 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:44.565 "is_configured": true, 00:29:44.565 "data_offset": 256, 00:29:44.565 "data_size": 7936 00:29:44.565 } 00:29:44.565 ] 00:29:44.565 }' 00:29:44.565 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:44.565 22:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:45.501 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:45.501 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:45.501 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:45.501 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 
00:29:45.501 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:45.501 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.501 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:45.501 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:45.501 "name": "raid_bdev1", 00:29:45.501 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:45.501 "strip_size_kb": 0, 00:29:45.501 "state": "online", 00:29:45.501 "raid_level": "raid1", 00:29:45.501 "superblock": true, 00:29:45.501 "num_base_bdevs": 2, 00:29:45.501 "num_base_bdevs_discovered": 2, 00:29:45.501 "num_base_bdevs_operational": 2, 00:29:45.501 "base_bdevs_list": [ 00:29:45.501 { 00:29:45.501 "name": "spare", 00:29:45.501 "uuid": "5b71983b-7c18-5547-b740-76032609eda0", 00:29:45.501 "is_configured": true, 00:29:45.501 "data_offset": 256, 00:29:45.501 "data_size": 7936 00:29:45.501 }, 00:29:45.501 { 00:29:45.501 "name": "BaseBdev2", 00:29:45.501 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:45.501 "is_configured": true, 00:29:45.501 "data_offset": 256, 00:29:45.501 "data_size": 7936 00:29:45.501 } 00:29:45.501 ] 00:29:45.501 }' 00:29:45.501 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:45.501 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:45.501 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:45.760 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:45.760 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.760 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:45.760 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:29:45.760 22:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:46.020 [2024-07-12 22:36:56.211614] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:46.020 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:46.020 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:46.020 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:46.020 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:46.020 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:46.020 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:46.020 22:36:56 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:46.020 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:46.020 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:46.020 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:46.020 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:46.020 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:46.279 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:46.279 "name": "raid_bdev1", 00:29:46.279 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:46.279 "strip_size_kb": 0, 00:29:46.279 "state": "online", 00:29:46.279 "raid_level": "raid1", 00:29:46.279 "superblock": true, 00:29:46.279 "num_base_bdevs": 2, 00:29:46.279 "num_base_bdevs_discovered": 1, 00:29:46.279 "num_base_bdevs_operational": 1, 00:29:46.279 "base_bdevs_list": [ 00:29:46.279 { 00:29:46.279 "name": null, 00:29:46.279 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:46.279 "is_configured": false, 00:29:46.279 "data_offset": 256, 00:29:46.279 "data_size": 7936 00:29:46.279 }, 00:29:46.279 { 00:29:46.279 "name": "BaseBdev2", 00:29:46.279 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:46.279 "is_configured": true, 00:29:46.279 "data_offset": 256, 00:29:46.279 "data_size": 7936 00:29:46.279 } 00:29:46.279 ] 00:29:46.279 }' 00:29:46.279 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:46.280 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:46.848 22:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:46.848 [2024-07-12 22:36:57.142074] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:46.848 [2024-07-12 22:36:57.142226] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:46.848 [2024-07-12 22:36:57.142242] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:29:46.848 [2024-07-12 22:36:57.142272] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:46.848 [2024-07-12 22:36:57.145796] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd023a0 00:29:46.848 [2024-07-12 22:36:57.147218] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:46.848 22:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:29:48.229 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:48.229 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:48.229 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:48.229 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:48.229 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:48.230 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.230 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:48.230 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:48.230 "name": "raid_bdev1", 00:29:48.230 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:48.230 "strip_size_kb": 0, 00:29:48.230 "state": "online", 00:29:48.230 "raid_level": "raid1", 00:29:48.230 "superblock": true, 00:29:48.230 "num_base_bdevs": 2, 00:29:48.230 "num_base_bdevs_discovered": 2, 00:29:48.230 "num_base_bdevs_operational": 2, 00:29:48.230 "process": { 00:29:48.230 "type": "rebuild", 00:29:48.230 "target": "spare", 00:29:48.230 "progress": { 00:29:48.230 "blocks": 3072, 00:29:48.230 "percent": 38 00:29:48.230 } 00:29:48.230 }, 00:29:48.230 "base_bdevs_list": [ 00:29:48.230 { 00:29:48.230 "name": "spare", 00:29:48.230 "uuid": "5b71983b-7c18-5547-b740-76032609eda0", 00:29:48.230 "is_configured": true, 00:29:48.230 "data_offset": 256, 00:29:48.230 "data_size": 7936 00:29:48.230 }, 00:29:48.230 { 00:29:48.230 "name": "BaseBdev2", 00:29:48.230 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:48.230 "is_configured": true, 00:29:48.230 "data_offset": 256, 00:29:48.230 "data_size": 7936 00:29:48.230 } 00:29:48.230 ] 00:29:48.230 }' 00:29:48.230 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:48.230 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:48.230 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:48.490 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:48.490 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:48.490 [2024-07-12 22:36:58.784708] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:48.750 [2024-07-12 22:36:58.860723] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:48.750 [2024-07-12 22:36:58.860767] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:48.750 [2024-07-12 22:36:58.860782] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:48.750 [2024-07-12 22:36:58.860790] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:48.750 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:48.750 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:48.750 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:48.750 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:48.750 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:48.750 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:48.750 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:48.750 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:48.750 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:48.750 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:48.750 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.750 22:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:49.317 22:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:49.317 "name": "raid_bdev1", 00:29:49.317 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:49.317 "strip_size_kb": 0, 00:29:49.317 "state": "online", 00:29:49.317 "raid_level": "raid1", 00:29:49.317 "superblock": true, 00:29:49.317 "num_base_bdevs": 2, 00:29:49.317 "num_base_bdevs_discovered": 1, 00:29:49.317 "num_base_bdevs_operational": 1, 00:29:49.317 "base_bdevs_list": [ 00:29:49.317 { 00:29:49.317 "name": null, 00:29:49.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:49.317 "is_configured": false, 00:29:49.317 "data_offset": 256, 00:29:49.317 "data_size": 7936 00:29:49.317 }, 00:29:49.317 { 00:29:49.317 "name": "BaseBdev2", 00:29:49.317 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:49.317 "is_configured": true, 00:29:49.317 "data_offset": 256, 00:29:49.317 "data_size": 7936 00:29:49.317 } 00:29:49.317 ] 00:29:49.317 }' 00:29:49.317 22:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:49.317 22:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:49.884 22:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:50.144 [2024-07-12 
22:37:00.220647] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:50.144 [2024-07-12 22:37:00.220707] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:50.144 [2024-07-12 22:37:00.220731] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcffc80 00:29:50.144 [2024-07-12 22:37:00.220743] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:50.144 [2024-07-12 22:37:00.220966] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:50.144 [2024-07-12 22:37:00.220983] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:50.144 [2024-07-12 22:37:00.221045] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:50.144 [2024-07-12 22:37:00.221057] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:50.144 [2024-07-12 22:37:00.221068] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:50.144 [2024-07-12 22:37:00.221086] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:50.144 [2024-07-12 22:37:00.224868] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd002d0 00:29:50.144 spare 00:29:50.144 [2024-07-12 22:37:00.226238] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:50.144 22:37:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:29:51.081 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:51.081 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:51.081 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:51.081 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:51.081 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:51.081 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:51.081 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:51.339 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:51.339 "name": "raid_bdev1", 00:29:51.339 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:51.339 "strip_size_kb": 0, 00:29:51.339 "state": "online", 00:29:51.339 "raid_level": "raid1", 00:29:51.339 "superblock": true, 00:29:51.339 "num_base_bdevs": 2, 00:29:51.339 "num_base_bdevs_discovered": 2, 00:29:51.339 "num_base_bdevs_operational": 2, 00:29:51.339 "process": { 00:29:51.339 "type": "rebuild", 00:29:51.339 "target": "spare", 00:29:51.339 "progress": { 00:29:51.339 "blocks": 3072, 00:29:51.339 "percent": 38 00:29:51.339 } 00:29:51.339 }, 00:29:51.339 "base_bdevs_list": [ 00:29:51.339 { 00:29:51.339 "name": "spare", 00:29:51.339 "uuid": "5b71983b-7c18-5547-b740-76032609eda0", 00:29:51.339 "is_configured": true, 00:29:51.339 "data_offset": 256, 00:29:51.339 "data_size": 
7936 00:29:51.339 }, 00:29:51.339 { 00:29:51.339 "name": "BaseBdev2", 00:29:51.339 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:51.339 "is_configured": true, 00:29:51.339 "data_offset": 256, 00:29:51.339 "data_size": 7936 00:29:51.339 } 00:29:51.339 ] 00:29:51.339 }' 00:29:51.339 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:51.339 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:51.339 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:51.339 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:51.339 22:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:51.906 [2024-07-12 22:37:02.063631] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:51.906 [2024-07-12 22:37:02.141175] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:51.906 [2024-07-12 22:37:02.141223] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:51.906 [2024-07-12 22:37:02.141239] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:51.906 [2024-07-12 22:37:02.141248] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:51.906 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:51.906 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:51.906 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:51.906 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:51.906 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:51.906 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:51.906 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:51.906 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:51.906 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:51.906 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:51.906 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:51.907 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:52.166 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:52.166 "name": "raid_bdev1", 00:29:52.166 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:52.166 "strip_size_kb": 0, 00:29:52.166 "state": "online", 00:29:52.166 "raid_level": 
"raid1", 00:29:52.166 "superblock": true, 00:29:52.166 "num_base_bdevs": 2, 00:29:52.166 "num_base_bdevs_discovered": 1, 00:29:52.166 "num_base_bdevs_operational": 1, 00:29:52.166 "base_bdevs_list": [ 00:29:52.166 { 00:29:52.166 "name": null, 00:29:52.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:52.166 "is_configured": false, 00:29:52.166 "data_offset": 256, 00:29:52.166 "data_size": 7936 00:29:52.166 }, 00:29:52.166 { 00:29:52.166 "name": "BaseBdev2", 00:29:52.166 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:52.166 "is_configured": true, 00:29:52.166 "data_offset": 256, 00:29:52.166 "data_size": 7936 00:29:52.166 } 00:29:52.166 ] 00:29:52.166 }' 00:29:52.166 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:52.166 22:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:52.735 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:52.735 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:52.735 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:52.735 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:52.735 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:52.735 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:52.735 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:52.994 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:52.994 "name": "raid_bdev1", 00:29:52.994 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:52.994 "strip_size_kb": 0, 00:29:52.994 "state": "online", 00:29:52.994 "raid_level": "raid1", 00:29:52.994 "superblock": true, 00:29:52.994 "num_base_bdevs": 2, 00:29:52.994 "num_base_bdevs_discovered": 1, 00:29:52.994 "num_base_bdevs_operational": 1, 00:29:52.994 "base_bdevs_list": [ 00:29:52.994 { 00:29:52.994 "name": null, 00:29:52.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:52.994 "is_configured": false, 00:29:52.994 "data_offset": 256, 00:29:52.994 "data_size": 7936 00:29:52.994 }, 00:29:52.994 { 00:29:52.994 "name": "BaseBdev2", 00:29:52.994 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:52.994 "is_configured": true, 00:29:52.994 "data_offset": 256, 00:29:52.994 "data_size": 7936 00:29:52.994 } 00:29:52.994 ] 00:29:52.994 }' 00:29:52.994 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:52.994 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:52.994 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:52.994 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:52.994 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:53.252 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:53.511 [2024-07-12 22:37:03.765689] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:53.511 [2024-07-12 22:37:03.765741] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:53.511 [2024-07-12 22:37:03.765763] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb667e0 00:29:53.511 [2024-07-12 22:37:03.765775] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:53.511 [2024-07-12 22:37:03.765962] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:53.511 [2024-07-12 22:37:03.765979] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:53.511 [2024-07-12 22:37:03.766027] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:53.511 [2024-07-12 22:37:03.766039] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:53.511 [2024-07-12 22:37:03.766049] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:53.511 BaseBdev1 00:29:53.511 22:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:54.890 "name": "raid_bdev1", 00:29:54.890 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:54.890 "strip_size_kb": 0, 00:29:54.890 "state": "online", 00:29:54.890 "raid_level": "raid1", 00:29:54.890 
"superblock": true, 00:29:54.890 "num_base_bdevs": 2, 00:29:54.890 "num_base_bdevs_discovered": 1, 00:29:54.890 "num_base_bdevs_operational": 1, 00:29:54.890 "base_bdevs_list": [ 00:29:54.890 { 00:29:54.890 "name": null, 00:29:54.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:54.890 "is_configured": false, 00:29:54.890 "data_offset": 256, 00:29:54.890 "data_size": 7936 00:29:54.890 }, 00:29:54.890 { 00:29:54.890 "name": "BaseBdev2", 00:29:54.890 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:54.890 "is_configured": true, 00:29:54.890 "data_offset": 256, 00:29:54.890 "data_size": 7936 00:29:54.890 } 00:29:54.890 ] 00:29:54.890 }' 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:54.890 22:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:55.458 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:55.458 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:55.458 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:55.458 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:55.458 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:55.458 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:55.458 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:55.716 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:55.716 "name": "raid_bdev1", 00:29:55.716 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:55.716 "strip_size_kb": 0, 00:29:55.717 "state": "online", 00:29:55.717 "raid_level": "raid1", 00:29:55.717 "superblock": true, 00:29:55.717 "num_base_bdevs": 2, 00:29:55.717 "num_base_bdevs_discovered": 1, 00:29:55.717 "num_base_bdevs_operational": 1, 00:29:55.717 "base_bdevs_list": [ 00:29:55.717 { 00:29:55.717 "name": null, 00:29:55.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:55.717 "is_configured": false, 00:29:55.717 "data_offset": 256, 00:29:55.717 "data_size": 7936 00:29:55.717 }, 00:29:55.717 { 00:29:55.717 "name": "BaseBdev2", 00:29:55.717 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:55.717 "is_configured": true, 00:29:55.717 "data_offset": 256, 00:29:55.717 "data_size": 7936 00:29:55.717 } 00:29:55.717 ] 00:29:55.717 }' 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:55.717 22:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:55.975 [2024-07-12 22:37:06.136022] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:55.975 [2024-07-12 22:37:06.136153] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:55.975 [2024-07-12 22:37:06.136169] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:55.975 request: 00:29:55.975 { 00:29:55.975 "base_bdev": "BaseBdev1", 00:29:55.975 "raid_bdev": "raid_bdev1", 00:29:55.975 "method": "bdev_raid_add_base_bdev", 00:29:55.975 "req_id": 1 00:29:55.975 } 00:29:55.975 Got JSON-RPC error response 00:29:55.975 response: 00:29:55.975 { 00:29:55.975 "code": -22, 00:29:55.975 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:55.975 } 00:29:55.975 22:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:29:55.975 22:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:55.975 22:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:55.975 22:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:55.975 22:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:29:56.911 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:56.911 22:37:07 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:56.911 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:56.911 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:56.911 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:56.911 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:56.911 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:56.911 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:56.911 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:56.911 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:56.911 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:56.911 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:57.169 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:57.169 "name": "raid_bdev1", 00:29:57.169 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:57.169 "strip_size_kb": 0, 00:29:57.169 "state": "online", 00:29:57.169 "raid_level": "raid1", 00:29:57.169 "superblock": true, 00:29:57.169 "num_base_bdevs": 2, 00:29:57.169 "num_base_bdevs_discovered": 1, 00:29:57.169 "num_base_bdevs_operational": 1, 00:29:57.169 "base_bdevs_list": [ 00:29:57.169 { 00:29:57.169 "name": null, 00:29:57.169 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:57.169 "is_configured": false, 00:29:57.169 "data_offset": 256, 00:29:57.169 "data_size": 7936 00:29:57.169 }, 00:29:57.169 { 00:29:57.169 "name": "BaseBdev2", 00:29:57.169 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:57.169 "is_configured": true, 00:29:57.169 "data_offset": 256, 00:29:57.169 "data_size": 7936 00:29:57.169 } 00:29:57.169 ] 00:29:57.169 }' 00:29:57.169 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:57.169 22:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:57.734 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:57.734 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:57.734 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:57.734 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:57.734 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:57.734 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:57.734 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:57.993 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:57.993 "name": "raid_bdev1", 00:29:57.993 "uuid": "c0c866f2-1f44-49ed-92c6-86cf6c2d896b", 00:29:57.993 "strip_size_kb": 0, 00:29:57.993 "state": "online", 00:29:57.993 "raid_level": "raid1", 00:29:57.993 "superblock": true, 00:29:57.993 "num_base_bdevs": 2, 00:29:57.993 "num_base_bdevs_discovered": 1, 00:29:57.993 "num_base_bdevs_operational": 1, 00:29:57.993 "base_bdevs_list": [ 00:29:57.993 { 00:29:57.993 "name": null, 00:29:57.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:57.993 "is_configured": false, 00:29:57.993 "data_offset": 256, 00:29:57.993 "data_size": 7936 00:29:57.993 }, 00:29:57.993 { 00:29:57.993 "name": "BaseBdev2", 00:29:57.993 "uuid": "5f860ee0-287b-536a-b95b-3c75ac3f3384", 00:29:57.993 "is_configured": true, 00:29:57.993 "data_offset": 256, 00:29:57.993 "data_size": 7936 00:29:57.993 } 00:29:57.993 ] 00:29:57.993 }' 00:29:57.993 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:57.993 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 3580636 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 3580636 ']' 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 3580636 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3580636 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3580636' 00:29:58.269 killing process with pid 3580636 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 3580636 00:29:58.269 Received shutdown signal, test time was about 60.000000 seconds 00:29:58.269 00:29:58.269 Latency(us) 00:29:58.269 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:58.269 =================================================================================================================== 00:29:58.269 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:58.269 [2024-07-12 22:37:08.412805] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:58.269 [2024-07-12 22:37:08.412902] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:58.269 [2024-07-12 22:37:08.412956] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:58.269 [2024-07-12 22:37:08.412969] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd00f60 name raid_bdev1, state offline 00:29:58.269 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 3580636 00:29:58.269 [2024-07-12 22:37:08.445045] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:58.589 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:29:58.589 00:29:58.589 real 0m28.736s 00:29:58.589 user 0m45.610s 00:29:58.589 sys 0m3.911s 00:29:58.589 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:58.589 22:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:58.589 ************************************ 00:29:58.589 END TEST raid_rebuild_test_sb_md_interleaved 00:29:58.589 ************************************ 00:29:58.589 22:37:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:58.589 22:37:08 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:29:58.589 22:37:08 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:29:58.589 22:37:08 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 3580636 ']' 00:29:58.589 22:37:08 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 3580636 00:29:58.589 22:37:08 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:29:58.589 00:29:58.589 real 18m22.339s 00:29:58.589 user 31m7.055s 00:29:58.589 sys 3m19.791s 00:29:58.589 22:37:08 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:58.589 22:37:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:58.589 ************************************ 00:29:58.589 END TEST bdev_raid 00:29:58.589 ************************************ 00:29:58.589 22:37:08 -- common/autotest_common.sh@1142 -- # return 0 00:29:58.589 22:37:08 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:58.589 22:37:08 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:58.589 22:37:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:58.589 22:37:08 -- common/autotest_common.sh@10 -- # set +x 00:29:58.589 ************************************ 00:29:58.589 START TEST bdevperf_config 00:29:58.589 ************************************ 00:29:58.589 22:37:08 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:58.589 * Looking for test storage... 
00:29:58.589 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:29:58.589 22:37:08 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:29:58.589 22:37:08 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:29:58.848 22:37:08 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:29:58.848 22:37:08 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:58.848 22:37:08 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:58.848 22:37:08 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:29:58.848 22:37:08 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:58.848 22:37:08 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:58.849 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:58.849 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:58.849 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:29:58.849 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:58.849 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:58.849 22:37:08 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:02.138 22:37:11 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-12 22:37:09.008304] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:30:02.138 [2024-07-12 22:37:09.008382] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3584793 ] 00:30:02.138 Using job config with 4 jobs 00:30:02.138 [2024-07-12 22:37:09.149107] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:02.138 [2024-07-12 22:37:09.276662] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:02.138 cpumask for '\''job0'\'' is too big 00:30:02.138 cpumask for '\''job1'\'' is too big 00:30:02.138 cpumask for '\''job2'\'' is too big 00:30:02.138 cpumask for '\''job3'\'' is too big 00:30:02.138 Running I/O for 2 seconds... 00:30:02.138 00:30:02.138 Latency(us) 00:30:02.138 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:02.138 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:02.138 Malloc0 : 2.01 23889.50 23.33 0.00 0.00 10703.66 1880.60 16526.47 00:30:02.138 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:02.138 Malloc0 : 2.02 23867.50 23.31 0.00 0.00 10689.45 1866.35 14588.88 00:30:02.138 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:02.138 Malloc0 : 2.02 23908.66 23.35 0.00 0.00 10645.59 1866.35 12708.29 00:30:02.138 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:02.138 Malloc0 : 2.03 23886.80 23.33 0.00 0.00 10632.02 1866.35 11283.59 00:30:02.138 =================================================================================================================== 00:30:02.138 Total : 95552.45 93.31 0.00 0.00 10667.60 1866.35 16526.47' 00:30:02.138 22:37:11 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-12 22:37:09.008304] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:30:02.138 [2024-07-12 22:37:09.008382] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3584793 ] 00:30:02.138 Using job config with 4 jobs 00:30:02.138 [2024-07-12 22:37:09.149107] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:02.138 [2024-07-12 22:37:09.276662] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:02.138 cpumask for '\''job0'\'' is too big 00:30:02.138 cpumask for '\''job1'\'' is too big 00:30:02.138 cpumask for '\''job2'\'' is too big 00:30:02.138 cpumask for '\''job3'\'' is too big 00:30:02.138 Running I/O for 2 seconds... 00:30:02.138 00:30:02.138 Latency(us) 00:30:02.138 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:02.138 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:02.138 Malloc0 : 2.01 23889.50 23.33 0.00 0.00 10703.66 1880.60 16526.47 00:30:02.138 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:02.138 Malloc0 : 2.02 23867.50 23.31 0.00 0.00 10689.45 1866.35 14588.88 00:30:02.138 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:02.138 Malloc0 : 2.02 23908.66 23.35 0.00 0.00 10645.59 1866.35 12708.29 00:30:02.139 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:02.139 Malloc0 : 2.03 23886.80 23.33 0.00 0.00 10632.02 1866.35 11283.59 00:30:02.139 =================================================================================================================== 00:30:02.139 Total : 95552.45 93.31 0.00 0.00 10667.60 1866.35 16526.47' 00:30:02.139 22:37:11 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 22:37:09.008304] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:30:02.139 [2024-07-12 22:37:09.008382] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3584793 ] 00:30:02.139 Using job config with 4 jobs 00:30:02.139 [2024-07-12 22:37:09.149107] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:02.139 [2024-07-12 22:37:09.276662] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:02.139 cpumask for '\''job0'\'' is too big 00:30:02.139 cpumask for '\''job1'\'' is too big 00:30:02.139 cpumask for '\''job2'\'' is too big 00:30:02.139 cpumask for '\''job3'\'' is too big 00:30:02.139 Running I/O for 2 seconds... 
00:30:02.139 00:30:02.139 Latency(us) 00:30:02.139 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:02.139 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:02.139 Malloc0 : 2.01 23889.50 23.33 0.00 0.00 10703.66 1880.60 16526.47 00:30:02.139 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:02.139 Malloc0 : 2.02 23867.50 23.31 0.00 0.00 10689.45 1866.35 14588.88 00:30:02.139 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:02.139 Malloc0 : 2.02 23908.66 23.35 0.00 0.00 10645.59 1866.35 12708.29 00:30:02.139 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:02.139 Malloc0 : 2.03 23886.80 23.33 0.00 0.00 10632.02 1866.35 11283.59 00:30:02.139 =================================================================================================================== 00:30:02.139 Total : 95552.45 93.31 0.00 0.00 10667.60 1866.35 16526.47' 00:30:02.139 22:37:11 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:02.139 22:37:11 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:02.139 22:37:11 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:30:02.139 22:37:11 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:02.139 [2024-07-12 22:37:11.756233] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:30:02.139 [2024-07-12 22:37:11.756281] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3585148 ] 00:30:02.139 [2024-07-12 22:37:11.880713] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:02.139 [2024-07-12 22:37:11.995768] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:02.139 cpumask for 'job0' is too big 00:30:02.139 cpumask for 'job1' is too big 00:30:02.139 cpumask for 'job2' is too big 00:30:02.139 cpumask for 'job3' is too big 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:30:04.698 Running I/O for 2 seconds... 
00:30:04.698 00:30:04.698 Latency(us) 00:30:04.698 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:04.698 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:04.698 Malloc0 : 2.02 23816.03 23.26 0.00 0.00 10736.13 1880.60 16412.49 00:30:04.698 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:04.698 Malloc0 : 2.02 23793.69 23.24 0.00 0.00 10721.63 1866.35 14531.90 00:30:04.698 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:04.698 Malloc0 : 2.02 23771.84 23.21 0.00 0.00 10707.80 1852.10 12708.29 00:30:04.698 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:04.698 Malloc0 : 2.03 23749.94 23.19 0.00 0.00 10693.32 1837.86 10998.65 00:30:04.698 =================================================================================================================== 00:30:04.698 Total : 95131.51 92.90 0.00 0.00 10714.72 1837.86 16412.49' 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:04.698 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:04.698 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:04.698 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:04.698 22:37:14 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-12 22:37:14.463684] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:30:07.236 [2024-07-12 22:37:14.463749] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3585504 ] 00:30:07.236 Using job config with 3 jobs 00:30:07.236 [2024-07-12 22:37:14.585382] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:07.236 [2024-07-12 22:37:14.698198] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:07.236 cpumask for '\''job0'\'' is too big 00:30:07.236 cpumask for '\''job1'\'' is too big 00:30:07.236 cpumask for '\''job2'\'' is too big 00:30:07.236 Running I/O for 2 seconds... 00:30:07.236 00:30:07.236 Latency(us) 00:30:07.236 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:07.236 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:07.236 Malloc0 : 2.02 32261.43 31.51 0.00 0.00 7927.93 1823.61 11625.52 00:30:07.236 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:07.236 Malloc0 : 2.02 32231.74 31.48 0.00 0.00 7917.84 1816.49 9801.91 00:30:07.236 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:07.236 Malloc0 : 2.02 32202.18 31.45 0.00 0.00 7907.58 1809.36 8149.26 00:30:07.236 =================================================================================================================== 00:30:07.236 Total : 96695.35 94.43 0.00 0.00 7917.79 1809.36 11625.52' 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-12 22:37:14.463684] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:30:07.236 [2024-07-12 22:37:14.463749] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3585504 ] 00:30:07.236 Using job config with 3 jobs 00:30:07.236 [2024-07-12 22:37:14.585382] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:07.236 [2024-07-12 22:37:14.698198] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:07.236 cpumask for '\''job0'\'' is too big 00:30:07.236 cpumask for '\''job1'\'' is too big 00:30:07.236 cpumask for '\''job2'\'' is too big 00:30:07.236 Running I/O for 2 seconds... 
00:30:07.236 00:30:07.236 Latency(us) 00:30:07.236 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:07.236 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:07.236 Malloc0 : 2.02 32261.43 31.51 0.00 0.00 7927.93 1823.61 11625.52 00:30:07.236 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:07.236 Malloc0 : 2.02 32231.74 31.48 0.00 0.00 7917.84 1816.49 9801.91 00:30:07.236 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:07.236 Malloc0 : 2.02 32202.18 31.45 0.00 0.00 7907.58 1809.36 8149.26 00:30:07.236 =================================================================================================================== 00:30:07.236 Total : 96695.35 94.43 0.00 0.00 7917.79 1809.36 11625.52' 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 22:37:14.463684] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:30:07.236 [2024-07-12 22:37:14.463749] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3585504 ] 00:30:07.236 Using job config with 3 jobs 00:30:07.236 [2024-07-12 22:37:14.585382] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:07.236 [2024-07-12 22:37:14.698198] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:07.236 cpumask for '\''job0'\'' is too big 00:30:07.236 cpumask for '\''job1'\'' is too big 00:30:07.236 cpumask for '\''job2'\'' is too big 00:30:07.236 Running I/O for 2 seconds... 00:30:07.236 00:30:07.236 Latency(us) 00:30:07.236 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:07.236 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:07.236 Malloc0 : 2.02 32261.43 31.51 0.00 0.00 7927.93 1823.61 11625.52 00:30:07.236 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:07.236 Malloc0 : 2.02 32231.74 31.48 0.00 0.00 7917.84 1816.49 9801.91 00:30:07.236 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:07.236 Malloc0 : 2.02 32202.18 31.45 0.00 0.00 7907.58 1809.36 8149.26 00:30:07.236 =================================================================================================================== 00:30:07.236 Total : 96695.35 94.43 0.00 0.00 7917.79 1809.36 11625.52' 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 
00:30:07.236 22:37:17 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:30:07.236 22:37:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:07.237 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:07.237 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:07.237 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:07.237 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:07.237 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:07.237 22:37:17 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:09.774 22:37:19 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-12 22:37:17.200225] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:30:09.774 [2024-07-12 22:37:17.200291] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3585858 ] 00:30:09.774 Using job config with 4 jobs 00:30:09.774 [2024-07-12 22:37:17.340052] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:09.774 [2024-07-12 22:37:17.461871] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:09.774 cpumask for '\''job0'\'' is too big 00:30:09.774 cpumask for '\''job1'\'' is too big 00:30:09.774 cpumask for '\''job2'\'' is too big 00:30:09.774 cpumask for '\''job3'\'' is too big 00:30:09.774 Running I/O for 2 seconds... 00:30:09.774 00:30:09.775 Latency(us) 00:30:09.775 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:09.775 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc0 : 2.02 11893.72 11.61 0.00 0.00 21504.42 3846.68 33508.84 00:30:09.775 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc1 : 2.04 11898.12 11.62 0.00 0.00 21474.60 4701.50 33280.89 00:30:09.775 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc0 : 2.05 11887.34 11.61 0.00 0.00 21413.91 3875.17 29405.72 00:30:09.775 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc1 : 2.05 11876.33 11.60 0.00 0.00 21411.61 4729.99 29405.72 00:30:09.775 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc0 : 2.05 11865.61 11.59 0.00 0.00 21356.35 3818.18 25530.55 00:30:09.775 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc1 : 2.05 11854.64 11.58 0.00 0.00 21356.84 4644.51 25530.55 00:30:09.775 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc0 : 2.05 11843.94 11.57 0.00 0.00 21297.78 3818.18 21883.33 00:30:09.775 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc1 : 2.06 11833.03 11.56 0.00 0.00 21298.55 4644.51 21883.33 00:30:09.775 =================================================================================================================== 00:30:09.775 Total : 94952.73 92.73 0.00 0.00 21389.11 3818.18 33508.84' 00:30:09.775 22:37:19 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-12 22:37:17.200225] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:30:09.775 [2024-07-12 22:37:17.200291] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3585858 ] 00:30:09.775 Using job config with 4 jobs 00:30:09.775 [2024-07-12 22:37:17.340052] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:09.775 [2024-07-12 22:37:17.461871] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:09.775 cpumask for '\''job0'\'' is too big 00:30:09.775 cpumask for '\''job1'\'' is too big 00:30:09.775 cpumask for '\''job2'\'' is too big 00:30:09.775 cpumask for '\''job3'\'' is too big 00:30:09.775 Running I/O for 2 seconds... 
00:30:09.775 00:30:09.775 Latency(us) 00:30:09.775 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:09.775 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc0 : 2.02 11893.72 11.61 0.00 0.00 21504.42 3846.68 33508.84 00:30:09.775 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc1 : 2.04 11898.12 11.62 0.00 0.00 21474.60 4701.50 33280.89 00:30:09.775 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc0 : 2.05 11887.34 11.61 0.00 0.00 21413.91 3875.17 29405.72 00:30:09.775 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc1 : 2.05 11876.33 11.60 0.00 0.00 21411.61 4729.99 29405.72 00:30:09.775 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc0 : 2.05 11865.61 11.59 0.00 0.00 21356.35 3818.18 25530.55 00:30:09.775 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc1 : 2.05 11854.64 11.58 0.00 0.00 21356.84 4644.51 25530.55 00:30:09.775 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc0 : 2.05 11843.94 11.57 0.00 0.00 21297.78 3818.18 21883.33 00:30:09.775 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc1 : 2.06 11833.03 11.56 0.00 0.00 21298.55 4644.51 21883.33 00:30:09.775 =================================================================================================================== 00:30:09.775 Total : 94952.73 92.73 0.00 0.00 21389.11 3818.18 33508.84' 00:30:09.775 22:37:19 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 22:37:17.200225] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:30:09.775 [2024-07-12 22:37:17.200291] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3585858 ] 00:30:09.775 Using job config with 4 jobs 00:30:09.775 [2024-07-12 22:37:17.340052] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:09.775 [2024-07-12 22:37:17.461871] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:09.775 cpumask for '\''job0'\'' is too big 00:30:09.775 cpumask for '\''job1'\'' is too big 00:30:09.775 cpumask for '\''job2'\'' is too big 00:30:09.775 cpumask for '\''job3'\'' is too big 00:30:09.775 Running I/O for 2 seconds... 
00:30:09.775 00:30:09.775 Latency(us) 00:30:09.775 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:09.775 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc0 : 2.02 11893.72 11.61 0.00 0.00 21504.42 3846.68 33508.84 00:30:09.775 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc1 : 2.04 11898.12 11.62 0.00 0.00 21474.60 4701.50 33280.89 00:30:09.775 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc0 : 2.05 11887.34 11.61 0.00 0.00 21413.91 3875.17 29405.72 00:30:09.775 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc1 : 2.05 11876.33 11.60 0.00 0.00 21411.61 4729.99 29405.72 00:30:09.775 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc0 : 2.05 11865.61 11.59 0.00 0.00 21356.35 3818.18 25530.55 00:30:09.775 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc1 : 2.05 11854.64 11.58 0.00 0.00 21356.84 4644.51 25530.55 00:30:09.775 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc0 : 2.05 11843.94 11.57 0.00 0.00 21297.78 3818.18 21883.33 00:30:09.775 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:09.775 Malloc1 : 2.06 11833.03 11.56 0.00 0.00 21298.55 4644.51 21883.33 00:30:09.775 =================================================================================================================== 00:30:09.775 Total : 94952.73 92.73 0.00 0.00 21389.11 3818.18 33508.84' 00:30:09.775 22:37:19 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:09.775 22:37:19 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:09.775 22:37:19 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:30:09.775 22:37:19 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:30:09.775 22:37:19 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:09.775 22:37:19 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:30:09.775 00:30:09.775 real 0m11.140s 00:30:09.775 user 0m9.886s 00:30:09.775 sys 0m1.096s 00:30:09.775 22:37:19 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:09.775 22:37:19 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:30:09.775 ************************************ 00:30:09.775 END TEST bdevperf_config 00:30:09.775 ************************************ 00:30:09.775 22:37:20 -- common/autotest_common.sh@1142 -- # return 0 00:30:09.775 22:37:20 -- spdk/autotest.sh@192 -- # uname -s 00:30:09.775 22:37:20 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:30:09.775 22:37:20 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:09.775 22:37:20 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:30:09.775 22:37:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:09.775 22:37:20 -- common/autotest_common.sh@10 -- # set +x 00:30:09.775 ************************************ 00:30:09.775 START TEST reactor_set_interrupt 00:30:09.775 ************************************ 00:30:09.775 22:37:20 
reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:10.037 * Looking for test storage... 00:30:10.037 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:10.037 22:37:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:30:10.037 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:10.037 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:10.037 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:10.037 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:30:10.037 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:10.037 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:30:10.037 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:30:10.037 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:30:10.037 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:30:10.037 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:30:10.037 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:30:10.037 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:30:10.037 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:30:10.037 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 
00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:30:10.037 22:37:20 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:30:10.038 22:37:20 
reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:30:10.038 22:37:20 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:30:10.038 22:37:20 
reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:30:10.038 #define SPDK_CONFIG_H 00:30:10.038 #define SPDK_CONFIG_APPS 1 00:30:10.038 #define SPDK_CONFIG_ARCH native 00:30:10.038 #undef SPDK_CONFIG_ASAN 00:30:10.038 #undef SPDK_CONFIG_AVAHI 00:30:10.038 #undef SPDK_CONFIG_CET 00:30:10.038 #define SPDK_CONFIG_COVERAGE 1 00:30:10.038 #define SPDK_CONFIG_CROSS_PREFIX 00:30:10.038 #define SPDK_CONFIG_CRYPTO 1 00:30:10.038 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:30:10.038 #undef SPDK_CONFIG_CUSTOMOCF 00:30:10.038 #undef SPDK_CONFIG_DAOS 00:30:10.038 #define SPDK_CONFIG_DAOS_DIR 00:30:10.038 #define SPDK_CONFIG_DEBUG 1 00:30:10.038 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:30:10.038 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:10.038 #define SPDK_CONFIG_DPDK_INC_DIR 00:30:10.038 #define SPDK_CONFIG_DPDK_LIB_DIR 00:30:10.038 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:30:10.038 #undef SPDK_CONFIG_DPDK_UADK 00:30:10.038 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:10.038 #define SPDK_CONFIG_EXAMPLES 1 00:30:10.038 #undef SPDK_CONFIG_FC 00:30:10.038 #define SPDK_CONFIG_FC_PATH 00:30:10.038 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:10.038 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:10.038 #undef SPDK_CONFIG_FUSE 00:30:10.038 #undef SPDK_CONFIG_FUZZER 00:30:10.038 #define SPDK_CONFIG_FUZZER_LIB 00:30:10.038 #undef SPDK_CONFIG_GOLANG 00:30:10.038 #define 
SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:10.038 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:10.038 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:10.038 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:10.038 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:10.038 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:10.038 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:10.038 #define SPDK_CONFIG_IDXD 1 00:30:10.038 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:10.038 #define SPDK_CONFIG_IPSEC_MB 1 00:30:10.038 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:10.038 #define SPDK_CONFIG_ISAL 1 00:30:10.038 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:10.038 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:10.038 #define SPDK_CONFIG_LIBDIR 00:30:10.038 #undef SPDK_CONFIG_LTO 00:30:10.038 #define SPDK_CONFIG_MAX_LCORES 128 00:30:10.038 #define SPDK_CONFIG_NVME_CUSE 1 00:30:10.038 #undef SPDK_CONFIG_OCF 00:30:10.038 #define SPDK_CONFIG_OCF_PATH 00:30:10.038 #define SPDK_CONFIG_OPENSSL_PATH 00:30:10.038 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:10.038 #define SPDK_CONFIG_PGO_DIR 00:30:10.038 #undef SPDK_CONFIG_PGO_USE 00:30:10.038 #define SPDK_CONFIG_PREFIX /usr/local 00:30:10.038 #undef SPDK_CONFIG_RAID5F 00:30:10.038 #undef SPDK_CONFIG_RBD 00:30:10.038 #define SPDK_CONFIG_RDMA 1 00:30:10.038 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:10.038 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:10.038 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:10.038 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:10.038 #define SPDK_CONFIG_SHARED 1 00:30:10.038 #undef SPDK_CONFIG_SMA 00:30:10.038 #define SPDK_CONFIG_TESTS 1 00:30:10.038 #undef SPDK_CONFIG_TSAN 00:30:10.038 #define SPDK_CONFIG_UBLK 1 00:30:10.038 #define SPDK_CONFIG_UBSAN 1 00:30:10.038 #undef SPDK_CONFIG_UNIT_TESTS 00:30:10.038 #undef SPDK_CONFIG_URING 00:30:10.038 #define SPDK_CONFIG_URING_PATH 00:30:10.038 #undef SPDK_CONFIG_URING_ZNS 00:30:10.038 #undef SPDK_CONFIG_USDT 00:30:10.038 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:10.038 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:10.038 #undef SPDK_CONFIG_VFIO_USER 00:30:10.038 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:10.038 #define SPDK_CONFIG_VHOST 1 00:30:10.038 #define SPDK_CONFIG_VIRTIO 1 00:30:10.038 #undef SPDK_CONFIG_VTUNE 00:30:10.038 #define SPDK_CONFIG_VTUNE_DIR 00:30:10.038 #define SPDK_CONFIG_WERROR 1 00:30:10.038 #define SPDK_CONFIG_WPDK_DIR 00:30:10.038 #undef SPDK_CONFIG_XNVME 00:30:10.038 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:10.038 22:37:20 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:10.038 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:10.038 22:37:20 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:10.038 22:37:20 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:10.039 22:37:20 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:10.039 22:37:20 reactor_set_interrupt -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:10.039 22:37:20 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:10.039 22:37:20 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:10.039 22:37:20 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:30:10.039 22:37:20 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:10.039 
22:37:20 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:10.039 22:37:20 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- 
common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:10.039 
22:37:20 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:30:10.039 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:30:10.040 22:37:20 
reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@240 -- # 
DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:30:10.040 
22:37:20 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE= 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 3586249 ]] 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 3586249 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.Cjb4m5 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.Cjb4m5/tests/interrupt /tmp/spdk.Cjb4m5 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:30:10.040 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4338139136 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=88913358848 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5595156480 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892292096 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9412608 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253790720 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=466944 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 
00:30:10.041 * Looking for test storage... 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=88913358848 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=7809748992 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:10.041 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=3586290 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:10.041 22:37:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 3586290 /var/tmp/spdk.sock 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 3586290 ']' 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:10.041 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
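The interrupt_tgt target above is launched with core mask 0x07 and the harness then blocks on waitforlisten until the RPC socket answers. A minimal sketch of that kind of wait loop, assuming rpc.py relative to the SPDK tree and the socket path shown in the trace (the function name and retry budget are illustrative, not the framework's exact code):

# Sketch: poll until an SPDK app answers on its RPC socket (illustrative helper;
# the framework's version is waitforlisten in autotest_common.sh).
wait_for_rpc_socket() {
    local pid=$1 rpc_sock=${2:-/var/tmp/spdk.sock} retries=100
    local rpc_py=./scripts/rpc.py        # path relative to the SPDK tree; adjust as needed
    while (( retries-- > 0 )); do
        kill -0 "$pid" 2>/dev/null || return 1          # target died during startup
        # rpc_get_methods only succeeds once the app is serving RPCs on the socket
        if [[ -S $rpc_sock ]] && "$rpc_py" -s "$rpc_sock" rpc_get_methods &>/dev/null; then
            return 0
        fi
        sleep 0.5
    done
    return 1                                             # timed out
}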
00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:10.041 22:37:20 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:10.041 [2024-07-12 22:37:20.336896] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:30:10.041 [2024-07-12 22:37:20.336975] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3586290 ] 00:30:10.301 [2024-07-12 22:37:20.466816] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:10.301 [2024-07-12 22:37:20.574231] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:10.301 [2024-07-12 22:37:20.574254] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:10.301 [2024-07-12 22:37:20.574258] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:10.559 [2024-07-12 22:37:20.645638] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:30:11.126 22:37:21 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:11.126 22:37:21 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:30:11.126 22:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:30:11.126 22:37:21 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:11.385 Malloc0 00:30:11.385 Malloc1 00:30:11.385 Malloc2 00:30:11.385 22:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:30:11.385 22:37:21 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:11.385 22:37:21 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:11.385 22:37:21 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:11.385 5000+0 records in 00:30:11.385 5000+0 records out 00:30:11.385 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0266776 s, 384 MB/s 00:30:11.385 22:37:21 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:11.644 AIO0 00:30:11.644 22:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 3586290 00:30:11.644 22:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 3586290 without_thd 00:30:11.644 22:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=3586290 00:30:11.644 22:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:30:11.644 22:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:30:11.644 22:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:11.644 22:37:21 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:11.644 22:37:21 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:11.644 22:37:21 reactor_set_interrupt -- 
interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:11.644 22:37:21 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:11.644 22:37:21 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:11.644 22:37:21 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:11.904 22:37:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:11.904 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:11.904 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:11.904 22:37:22 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:11.904 22:37:22 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:11.904 22:37:22 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:11.904 22:37:22 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:11.904 22:37:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:11.904 22:37:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:12.164 spdk_thread ids are 1 on reactor0. 
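The 'spdk_thread ids are 1 on reactor0.' line comes from filtering thread_get_stats output by the reactor's CPU mask. A minimal sketch of that lookup, mirroring the jq expression traced above (the function name and rpc.py path are illustrative; the mask is compared without its 0x prefix, as in the trace):

# Sketch: map a reactor's CPU mask to the spdk_thread ids pinned to it,
# mirroring the thread_get_stats | jq filter traced above (names illustrative).
reactor_thread_ids() {
    local cpumask=${1#0x}                 # thread_get_stats reports the mask without 0x
    local rpc_py=./scripts/rpc.py         # adjust to the SPDK tree in use
    "$rpc_py" thread_get_stats |
        jq --arg reactor_cpumask "$cpumask" \
           '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
}

# reactor_thread_ids 0x1   -> "1" here (the app_thread), matching the echo above
# reactor_thread_ids 0x4   -> empty, since no thread is pinned to reactor 2 yet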
00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3586290 0 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3586290 0 idle 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3586290 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3586290 -w 256 00:30:12.164 22:37:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3586290 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.37 reactor_0' 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3586290 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.37 reactor_0 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3586290 1 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3586290 1 idle 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3586290 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3586290 -w 256 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 
00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3586297 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3586297 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3586290 2 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3586290 2 idle 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3586290 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3586290 -w 256 00:30:12.424 22:37:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3586298 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3586298 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 
00:30:12.683 22:37:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:30:12.942 [2024-07-12 22:37:23.139272] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:30:12.942 22:37:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:13.201 [2024-07-12 22:37:23.382954] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:13.201 [2024-07-12 22:37:23.383198] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:13.201 22:37:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:13.460 [2024-07-12 22:37:23.626922] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:30:13.460 [2024-07-12 22:37:23.627042] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:13.460 22:37:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:13.460 22:37:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 3586290 0 00:30:13.460 22:37:23 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 3586290 0 busy 00:30:13.460 22:37:23 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3586290 00:30:13.460 22:37:23 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:13.460 22:37:23 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:13.460 22:37:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:13.460 22:37:23 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:13.460 22:37:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:13.460 22:37:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:13.460 22:37:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3586290 -w 256 00:30:13.460 22:37:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3586290 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.82 reactor_0' 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3586290 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.82 reactor_0 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:13.720 22:37:23 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 3586290 2 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 3586290 2 busy 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3586290 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:13.720 22:37:23 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:13.721 22:37:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:13.721 22:37:23 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:13.721 22:37:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:13.721 22:37:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:13.721 22:37:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3586290 -w 256 00:30:13.721 22:37:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:13.721 22:37:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3586298 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.37 reactor_2' 00:30:13.721 22:37:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3586298 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.37 reactor_2 00:30:13.721 22:37:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:13.721 22:37:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:13.721 22:37:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:13.721 22:37:24 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:13.721 22:37:24 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:13.721 22:37:24 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:13.721 22:37:24 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:13.721 22:37:24 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:13.721 22:37:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:13.980 [2024-07-12 22:37:24.250915] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:30:13.980 [2024-07-12 22:37:24.251018] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 3586290 2 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3586290 2 idle 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3586290 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3586290 -w 256 00:30:13.980 22:37:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:14.239 22:37:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3586298 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.62 reactor_2' 00:30:14.239 22:37:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3586298 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.62 reactor_2 00:30:14.239 22:37:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:14.239 22:37:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:14.239 22:37:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:14.239 22:37:24 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:14.239 22:37:24 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:14.239 22:37:24 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:14.239 22:37:24 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:14.239 22:37:24 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:14.239 22:37:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:14.498 [2024-07-12 22:37:24.674930] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:14.498 [2024-07-12 22:37:24.675049] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:14.498 22:37:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:30:14.498 22:37:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:30:14.498 22:37:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:30:14.756 [2024-07-12 22:37:24.919130] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
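Every reactor_is_busy / reactor_is_idle assertion in this trace reduces to one batch-mode top snapshot of the target's threads: pick the reactor_N row, read its %CPU column, and compare against a threshold (roughly >=70% for busy, <=30% for idle, as the bracket tests above show). A minimal sketch of that check; the helper name and exact structure are illustrative:

# Sketch: decide whether reactor_<idx> of a pid looks busy or idle from one top sample.
reactor_state_ok() {
    local pid=$1 idx=$2 want=$3           # want is "busy" or "idle"
    local row rate
    # -bH: batch mode with individual threads; -n 1: a single snapshot
    row=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_${idx}") || return 1
    rate=$(awk '{print $9}' <<< "$row")   # %CPU column
    rate=${rate%%.*}                      # keep the integer part, as the trace does
    if [[ $want == busy ]]; then
        (( rate >= 70 ))                  # thresholds mirror the bracket tests in the log
    else
        (( rate <= 30 ))
    fi
}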
00:30:14.756 22:37:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 3586290 0 00:30:14.756 22:37:24 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3586290 0 idle 00:30:14.756 22:37:24 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3586290 00:30:14.756 22:37:24 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:14.756 22:37:24 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:14.756 22:37:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:14.756 22:37:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:14.756 22:37:24 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:14.756 22:37:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:14.756 22:37:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:14.756 22:37:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3586290 -w 256 00:30:14.756 22:37:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:15.014 22:37:25 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3586290 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:01.68 reactor_0' 00:30:15.014 22:37:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3586290 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:01.68 reactor_0 00:30:15.014 22:37:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:15.014 22:37:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:15.014 22:37:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:30:15.014 22:37:25 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:30:15.014 22:37:25 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:15.014 22:37:25 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:15.014 22:37:25 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:30:15.015 22:37:25 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:15.015 22:37:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:15.015 22:37:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:30:15.015 22:37:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:30:15.015 22:37:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 3586290 00:30:15.015 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 3586290 ']' 00:30:15.015 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 3586290 00:30:15.015 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:30:15.015 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:15.015 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3586290 00:30:15.015 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:15.015 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:15.015 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3586290' 00:30:15.015 killing process with pid 3586290 00:30:15.015 22:37:25 reactor_set_interrupt -- 
common/autotest_common.sh@967 -- # kill 3586290 00:30:15.015 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 3586290 00:30:15.273 22:37:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:30:15.273 22:37:25 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:15.273 22:37:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:30:15.273 22:37:25 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:15.273 22:37:25 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:15.273 22:37:25 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=3587062 00:30:15.273 22:37:25 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:15.273 22:37:25 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:15.273 22:37:25 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 3587062 /var/tmp/spdk.sock 00:30:15.273 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 3587062 ']' 00:30:15.273 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:15.273 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:15.273 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:15.273 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:15.273 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:15.273 22:37:25 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:15.273 [2024-07-12 22:37:25.475187] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:30:15.273 [2024-07-12 22:37:25.475262] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3587062 ] 00:30:15.531 [2024-07-12 22:37:25.605897] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:15.532 [2024-07-12 22:37:25.709973] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:15.532 [2024-07-12 22:37:25.710057] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:15.532 [2024-07-12 22:37:25.710063] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:15.532 [2024-07-12 22:37:25.785738] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
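Between the two interrupt_tgt runs the harness tears the first target down with killprocess and then removes the AIO backing file in cleanup. A minimal sketch of that teardown, with illustrative names (the real killprocess additionally inspects the process name to handle targets started under sudo):

# Sketch of the teardown step between the two interrupt_tgt runs: kill the target,
# wait for it, and drop the AIO file created earlier by dd + bdev_aio_create.
stop_target() {
    local pid=$1 aiofile=$2
    [[ -n $pid ]] || return 1
    if kill -0 "$pid" 2>/dev/null; then   # only signal a process that is still alive
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" 2>/dev/null || true   # reap it so the next run starts clean
    fi
    rm -f "$aiofile"
}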
00:30:16.467 22:37:26 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:16.467 22:37:26 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:30:16.467 22:37:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:30:16.467 22:37:26 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:16.467 Malloc0 00:30:16.467 Malloc1 00:30:16.467 Malloc2 00:30:16.467 22:37:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:30:16.467 22:37:26 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:16.467 22:37:26 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:16.467 22:37:26 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:16.467 5000+0 records in 00:30:16.467 5000+0 records out 00:30:16.467 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0287309 s, 356 MB/s 00:30:16.467 22:37:26 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:16.725 AIO0 00:30:16.725 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 3587062 00:30:16.725 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 3587062 00:30:16.725 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=3587062 00:30:16.725 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:30:16.725 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:30:16.725 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:16.725 22:37:27 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:16.725 22:37:27 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:16.725 22:37:27 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:16.725 22:37:27 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:16.725 22:37:27 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:16.725 22:37:27 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:16.982 22:37:27 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:16.982 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:16.982 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:16.982 22:37:27 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:16.982 22:37:27 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:16.982 22:37:27 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:16.982 22:37:27 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:30:16.982 22:37:27 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:16.982 22:37:27 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:17.240 spdk_thread ids are 1 on reactor0. 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3587062 0 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3587062 0 idle 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3587062 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3587062 -w 256 00:30:17.240 22:37:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3587062 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0' 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3587062 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3587062 1 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3587062 1 idle 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3587062 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:30:17.499 22:37:27 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3587062 -w 256 00:30:17.499 22:37:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3587066 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3587066 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3587062 2 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3587062 2 idle 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3587062 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:17.757 22:37:27 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:17.758 22:37:27 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:17.758 22:37:27 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:17.758 22:37:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:17.758 22:37:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:17.758 22:37:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3587062 -w 256 00:30:17.758 22:37:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:17.758 22:37:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3587067 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:30:17.758 22:37:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3587067 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:30:17.758 22:37:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:17.758 22:37:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:17.758 22:37:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
00:30:17.758 22:37:28 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:17.758 22:37:28 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:17.758 22:37:28 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:17.758 22:37:28 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:17.758 22:37:28 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:17.758 22:37:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:30:17.758 22:37:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:18.016 [2024-07-12 22:37:28.298664] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:18.016 [2024-07-12 22:37:28.298862] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:30:18.016 [2024-07-12 22:37:28.299093] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:18.016 22:37:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:18.274 [2024-07-12 22:37:28.535180] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:30:18.274 [2024-07-12 22:37:28.535358] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:18.274 22:37:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:18.274 22:37:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 3587062 0 00:30:18.274 22:37:28 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 3587062 0 busy 00:30:18.274 22:37:28 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3587062 00:30:18.274 22:37:28 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:18.274 22:37:28 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:18.274 22:37:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:18.274 22:37:28 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:18.274 22:37:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:18.274 22:37:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:18.274 22:37:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3587062 -w 256 00:30:18.274 22:37:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3587062 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.83 reactor_0' 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3587062 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.83 reactor_0 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:18.533 22:37:28 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 3587062 2 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 3587062 2 busy 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3587062 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3587062 -w 256 00:30:18.533 22:37:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:18.793 22:37:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3587067 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.35 reactor_2' 00:30:18.793 22:37:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3587067 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.35 reactor_2 00:30:18.793 22:37:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:18.793 22:37:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:18.793 22:37:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:18.793 22:37:28 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:18.793 22:37:28 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:18.793 22:37:28 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:18.793 22:37:28 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:18.793 22:37:28 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:18.793 22:37:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:19.052 [2024-07-12 22:37:29.140898] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:30:19.052 [2024-07-12 22:37:29.141009] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 3587062 2 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3587062 2 idle 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3587062 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3587062 -w 256 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3587067 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2' 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3587067 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:19.052 22:37:29 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:19.053 22:37:29 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:19.053 22:37:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:19.053 22:37:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:19.053 22:37:29 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:19.053 22:37:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:19.312 [2024-07-12 22:37:29.578022] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:19.312 [2024-07-12 22:37:29.578157] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:30:19.312 [2024-07-12 22:37:29.578182] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 3587062 0 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3587062 0 idle 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3587062 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3587062 -w 256 00:30:19.312 22:37:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3587062 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.68 reactor_0' 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3587062 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.68 reactor_0 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:30:19.570 22:37:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 3587062 00:30:19.570 22:37:29 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 3587062 ']' 00:30:19.570 22:37:29 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 3587062 00:30:19.570 22:37:29 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:30:19.570 22:37:29 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:19.570 22:37:29 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3587062 00:30:19.570 22:37:29 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:19.570 22:37:29 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:30:19.570 22:37:29 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3587062' 00:30:19.570 killing process with pid 3587062 00:30:19.570 22:37:29 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 3587062 00:30:19.570 22:37:29 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 3587062 00:30:19.828 22:37:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:30:19.828 22:37:30 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:19.828 00:30:19.828 real 0m10.023s 00:30:19.828 user 0m9.438s 00:30:19.828 sys 0m2.155s 00:30:19.828 22:37:30 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:19.828 22:37:30 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:19.828 ************************************ 00:30:19.828 END TEST reactor_set_interrupt 00:30:19.828 ************************************ 00:30:19.828 22:37:30 -- common/autotest_common.sh@1142 -- # return 0 00:30:19.828 22:37:30 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:19.828 22:37:30 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:30:19.828 22:37:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:19.828 22:37:30 -- common/autotest_common.sh@10 -- # set +x 00:30:20.090 ************************************ 00:30:20.090 START TEST reap_unregistered_poller 00:30:20.090 ************************************ 00:30:20.090 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:20.090 * Looking for test storage... 00:30:20.090 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:20.090 22:37:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:30:20.090 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:20.090 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:20.090 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:20.090 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
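Annotation: teardown at the end of the test is the usual killprocess/cleanup pair seen in the trace. A simplified sketch under the same names (not the verbatim common/autotest_common.sh helper):

    killprocess() {
        local pid=$1
        [[ -n $pid ]] || return 1
        kill -0 "$pid" 2>/dev/null || return 0        # nothing left to kill
        if [[ $(uname) = Linux ]]; then
            local name
            name=$(ps --no-headers -o comm= "$pid")
            # if the target had been launched through sudo, the real helper signals
            # the child differently; here ps reports reactor_0, so a plain kill is used
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"
    }
    # cleanup then removes the AIO backing file used by the interrupt tests:
    rm -f "$testdir/aiofile"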
00:30:20.090 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:20.090 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:30:20.090 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:30:20.090 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:30:20.090 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:30:20.090 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:30:20.090 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:30:20.090 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:30:20.090 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:30:20.090 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 
00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@57 -- # 
CONFIG_HAVE_LIBBSD=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:30:20.090 22:37:30 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:30:20.091 22:37:30 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:30:20.091 22:37:30 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:30:20.091 22:37:30 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:30:20.091 22:37:30 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:30:20.091 22:37:30 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:30:20.091 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@10 -- # 
_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:30:20.091 #define SPDK_CONFIG_H 00:30:20.091 #define SPDK_CONFIG_APPS 1 00:30:20.091 #define SPDK_CONFIG_ARCH native 00:30:20.091 #undef SPDK_CONFIG_ASAN 00:30:20.091 #undef SPDK_CONFIG_AVAHI 00:30:20.091 #undef SPDK_CONFIG_CET 00:30:20.091 #define SPDK_CONFIG_COVERAGE 1 00:30:20.091 #define SPDK_CONFIG_CROSS_PREFIX 00:30:20.091 #define SPDK_CONFIG_CRYPTO 1 00:30:20.091 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:30:20.091 #undef SPDK_CONFIG_CUSTOMOCF 00:30:20.091 #undef SPDK_CONFIG_DAOS 00:30:20.091 #define SPDK_CONFIG_DAOS_DIR 00:30:20.091 #define SPDK_CONFIG_DEBUG 1 00:30:20.091 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:30:20.091 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:20.091 #define SPDK_CONFIG_DPDK_INC_DIR 00:30:20.091 #define SPDK_CONFIG_DPDK_LIB_DIR 00:30:20.091 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:30:20.091 #undef SPDK_CONFIG_DPDK_UADK 00:30:20.091 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:20.091 #define SPDK_CONFIG_EXAMPLES 1 00:30:20.091 #undef SPDK_CONFIG_FC 00:30:20.091 #define SPDK_CONFIG_FC_PATH 00:30:20.091 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:20.091 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:20.091 #undef SPDK_CONFIG_FUSE 00:30:20.091 #undef SPDK_CONFIG_FUZZER 00:30:20.091 #define SPDK_CONFIG_FUZZER_LIB 00:30:20.091 #undef SPDK_CONFIG_GOLANG 00:30:20.091 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:20.091 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:20.091 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:20.091 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:20.091 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:20.091 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:20.091 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:20.091 #define SPDK_CONFIG_IDXD 1 00:30:20.091 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:20.091 #define SPDK_CONFIG_IPSEC_MB 1 00:30:20.091 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:20.091 #define SPDK_CONFIG_ISAL 1 00:30:20.091 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:20.091 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:20.091 #define SPDK_CONFIG_LIBDIR 00:30:20.091 #undef SPDK_CONFIG_LTO 
00:30:20.091 #define SPDK_CONFIG_MAX_LCORES 128 00:30:20.091 #define SPDK_CONFIG_NVME_CUSE 1 00:30:20.091 #undef SPDK_CONFIG_OCF 00:30:20.091 #define SPDK_CONFIG_OCF_PATH 00:30:20.091 #define SPDK_CONFIG_OPENSSL_PATH 00:30:20.091 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:20.091 #define SPDK_CONFIG_PGO_DIR 00:30:20.091 #undef SPDK_CONFIG_PGO_USE 00:30:20.091 #define SPDK_CONFIG_PREFIX /usr/local 00:30:20.091 #undef SPDK_CONFIG_RAID5F 00:30:20.091 #undef SPDK_CONFIG_RBD 00:30:20.091 #define SPDK_CONFIG_RDMA 1 00:30:20.091 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:20.091 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:20.091 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:20.091 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:20.091 #define SPDK_CONFIG_SHARED 1 00:30:20.091 #undef SPDK_CONFIG_SMA 00:30:20.091 #define SPDK_CONFIG_TESTS 1 00:30:20.091 #undef SPDK_CONFIG_TSAN 00:30:20.091 #define SPDK_CONFIG_UBLK 1 00:30:20.091 #define SPDK_CONFIG_UBSAN 1 00:30:20.091 #undef SPDK_CONFIG_UNIT_TESTS 00:30:20.091 #undef SPDK_CONFIG_URING 00:30:20.091 #define SPDK_CONFIG_URING_PATH 00:30:20.091 #undef SPDK_CONFIG_URING_ZNS 00:30:20.091 #undef SPDK_CONFIG_USDT 00:30:20.091 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:20.091 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:20.091 #undef SPDK_CONFIG_VFIO_USER 00:30:20.091 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:20.091 #define SPDK_CONFIG_VHOST 1 00:30:20.091 #define SPDK_CONFIG_VIRTIO 1 00:30:20.091 #undef SPDK_CONFIG_VTUNE 00:30:20.091 #define SPDK_CONFIG_VTUNE_DIR 00:30:20.091 #define SPDK_CONFIG_WERROR 1 00:30:20.091 #define SPDK_CONFIG_WPDK_DIR 00:30:20.091 #undef SPDK_CONFIG_XNVME 00:30:20.091 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:20.091 22:37:30 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:20.091 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:20.091 22:37:30 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:20.091 22:37:30 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:20.091 22:37:30 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:20.091 22:37:30 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:20.091 22:37:30 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:20.091 22:37:30 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:20.091 22:37:30 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:30:20.091 22:37:30 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:20.091 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:20.091 22:37:30 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:20.091 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:30:20.091 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:20.091 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:30:20.091 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:20.091 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:30:20.091 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 
00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:20.092 22:37:30 reap_unregistered_poller -- 
common/autotest_common.sh@126 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:20.092 22:37:30 
reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:20.092 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 
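Annotation: the long run of ": 0" / "export SPDK_TEST_*" pairs above is consistent with the standard bash default-and-export idiom, where flags already set by the job keep their value and everything else falls back to a default. Illustrative form (flag names taken from the trace; the exact source lines are not shown in this log):

    : "${SPDK_TEST_CRYPTO:=0}";            export SPDK_TEST_CRYPTO
    : "${SPDK_TEST_VBDEV_COMPRESS:=0}";    export SPDK_TEST_VBDEV_COMPRESS
    : "${SPDK_TEST_NVMF_TRANSPORT:=rdma}"; export SPDK_TEST_NVMF_TRANSPORT
    : "${SPDK_RUN_UBSAN:=0}";              export SPDK_RUN_UBSAN
    # xtrace prints the expanded command, which is why the log shows bare
    # ": 1" / ": 0" / ": rdma" lines immediately followed by the matching export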
00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export 
SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 3587816 ]] 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 3587816 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v 
testdir ]] 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.V7eqsx 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.V7eqsx/tests/interrupt /tmp/spdk.V7eqsx 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=88913203200 00:30:20.093 22:37:30 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5595312128 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892292096 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9412608 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:20.093 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253790720 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=466944 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:30:20.378 * Looking for test storage... 
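Annotation: set_test_storage, traced above, decides where the test may stage scratch data. It snapshots df -T into per-mount arrays, resolves the mount point backing each candidate directory, and accepts the first one whose available space covers the requested 2 GiB plus margin (it also checks the request would not push the filesystem past ~95% full). A condensed sketch under the same variable names, offered as an illustration rather than the verbatim helper:

    local -A mounts fss sizes avails uses
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$size
        uses["$mount"]=$use
        avails["$mount"]=$avail
    done < <(df -T | grep -v Filesystem)

    # mount point backing the candidate test directory
    mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
    target_space=${avails[$mount]}
    if (( target_space >= requested_size )); then
        export SPDK_TEST_STORAGE=$target_dir      # enough room: run the test in place
    fi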
00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=88913203200 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=7809904640 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:20.378 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=3587899 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:20.378 22:37:30 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 3587899 /var/tmp/spdk.sock 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 3587899 ']' 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:30:20.378 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:20.378 22:37:30 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:20.378 [2024-07-12 22:37:30.463842] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:30:20.378 [2024-07-12 22:37:30.463910] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3587899 ] 00:30:20.378 [2024-07-12 22:37:30.591449] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:20.659 [2024-07-12 22:37:30.697009] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:20.659 [2024-07-12 22:37:30.697097] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:20.659 [2024-07-12 22:37:30.697101] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:20.659 [2024-07-12 22:37:30.768442] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:30:21.228 22:37:31 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:21.228 22:37:31 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:30:21.228 22:37:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:30:21.228 22:37:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:30:21.228 22:37:31 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:21.228 22:37:31 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:21.228 22:37:31 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:21.228 22:37:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:30:21.228 "name": "app_thread", 00:30:21.228 "id": 1, 00:30:21.228 "active_pollers": [], 00:30:21.228 "timed_pollers": [ 00:30:21.228 { 00:30:21.228 "name": "rpc_subsystem_poll_servers", 00:30:21.228 "id": 1, 00:30:21.228 "state": "waiting", 00:30:21.228 "run_count": 0, 00:30:21.228 "busy_count": 0, 00:30:21.228 "period_ticks": 9200000 00:30:21.228 } 00:30:21.228 ], 00:30:21.228 "paused_pollers": [] 00:30:21.228 }' 00:30:21.228 22:37:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:30:21.228 22:37:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:30:21.228 22:37:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:30:21.228 22:37:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:30:21.228 22:37:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:30:21.228 22:37:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:30:21.487 22:37:31 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:30:21.487 22:37:31 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:21.487 22:37:31 reap_unregistered_poller -- interrupt/common.sh@76 
-- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:21.487 5000+0 records in 00:30:21.487 5000+0 records out 00:30:21.487 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0241437 s, 424 MB/s 00:30:21.487 22:37:31 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:21.745 AIO0 00:30:21.745 22:37:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:21.745 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:30:22.004 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:30:22.004 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:22.004 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:30:22.004 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:22.004 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:22.004 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:30:22.004 "name": "app_thread", 00:30:22.004 "id": 1, 00:30:22.004 "active_pollers": [], 00:30:22.004 "timed_pollers": [ 00:30:22.004 { 00:30:22.004 "name": "rpc_subsystem_poll_servers", 00:30:22.004 "id": 1, 00:30:22.004 "state": "waiting", 00:30:22.004 "run_count": 0, 00:30:22.004 "busy_count": 0, 00:30:22.004 "period_ticks": 9200000 00:30:22.004 } 00:30:22.004 ], 00:30:22.004 "paused_pollers": [] 00:30:22.004 }' 00:30:22.004 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:30:22.004 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:30:22.004 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:30:22.004 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:30:22.004 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:30:22.004 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:30:22.004 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:30:22.004 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 3587899 00:30:22.004 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 3587899 ']' 00:30:22.004 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 3587899 00:30:22.004 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:30:22.004 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:22.004 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3587899 00:30:22.004 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:22.004 
22:37:32 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:22.004 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3587899' 00:30:22.004 killing process with pid 3587899 00:30:22.004 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 3587899 00:30:22.004 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 3587899 00:30:22.263 22:37:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:30:22.263 22:37:32 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:22.263 00:30:22.263 real 0m2.381s 00:30:22.263 user 0m1.476s 00:30:22.263 sys 0m0.648s 00:30:22.263 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:22.263 22:37:32 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:22.263 ************************************ 00:30:22.263 END TEST reap_unregistered_poller 00:30:22.263 ************************************ 00:30:22.263 22:37:32 -- common/autotest_common.sh@1142 -- # return 0 00:30:22.263 22:37:32 -- spdk/autotest.sh@198 -- # uname -s 00:30:22.263 22:37:32 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:30:22.263 22:37:32 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:30:22.263 22:37:32 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:30:22.263 22:37:32 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:30:22.263 22:37:32 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:30:22.263 22:37:32 -- spdk/autotest.sh@260 -- # timing_exit lib 00:30:22.263 22:37:32 -- common/autotest_common.sh@728 -- # xtrace_disable 00:30:22.263 22:37:32 -- common/autotest_common.sh@10 -- # set +x 00:30:22.522 22:37:32 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:30:22.522 22:37:32 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:22.522 22:37:32 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:22.522 22:37:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:22.522 22:37:32 -- common/autotest_common.sh@10 -- # set +x 00:30:22.522 ************************************ 00:30:22.522 START TEST compress_compdev 00:30:22.522 ************************************ 00:30:22.522 22:37:32 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:22.522 * Looking for test storage... 
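Before the compress_compdev storage probe continues below, note the killprocess pattern that closed out the reap test above: the helper confirms the pid still names an SPDK reactor (reactor_0 here) via ps before announcing the kill and reaping the process. A simplified sketch of that shape, not the literal autotest_common.sh helper:

    # Sketch of the killprocess pattern traced above.
    killprocess() {
        local pid=$1
        kill -0 "$pid" 2>/dev/null || return 0          # nothing left to kill
        local name
        name=$(ps --no-headers -o comm= "$pid")         # e.g. reactor_0
        if [ "$name" != sudo ]; then
            echo "killing process with pid $pid"
            kill "$pid"
        fi
        wait "$pid" 2>/dev/null || true
    }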
00:30:22.522 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:30:22.522 22:37:32 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:22.522 22:37:32 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:22.522 22:37:32 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:22.522 22:37:32 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:22.522 22:37:32 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:22.522 22:37:32 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:22.522 22:37:32 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:22.522 22:37:32 compress_compdev -- paths/export.sh@5 -- # export PATH 00:30:22.522 22:37:32 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:30:22.522 22:37:32 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:30:22.522 22:37:32 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:22.522 22:37:32 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:30:22.522 22:37:32 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:30:22.522 22:37:32 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:30:22.522 22:37:32 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:22.522 22:37:32 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=3588243 00:30:22.522 22:37:32 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:22.522 22:37:32 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:22.522 22:37:32 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 3588243 00:30:22.522 22:37:32 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 3588243 ']' 00:30:22.522 22:37:32 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:22.522 22:37:32 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:22.523 22:37:32 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:22.523 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
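At this point bdevperf has been launched idle (-z) on core mask 0x6 and the script blocks in waitforlisten until the target's RPC socket answers. Roughly, that wait amounts to the following (a sketch assuming the standard /var/tmp/spdk.sock address; the retry count and sleep interval are illustrative):

    # Sketch: poll the target's RPC socket until it responds, bailing out if
    # the process dies first.
    waitforlisten() {
        local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} i
        echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
        for ((i = 100; i != 0; i--)); do
            kill -0 "$pid" 2>/dev/null || return 1                  # target exited early
            if ./scripts/rpc.py -s "$rpc_addr" -t 1 rpc_get_methods &> /dev/null; then
                return 0                                            # RPC server is ready
            fi
            sleep 0.5
        done
        return 1
    }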
00:30:22.523 22:37:32 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:22.523 22:37:32 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:22.523 [2024-07-12 22:37:32.831508] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:30:22.523 [2024-07-12 22:37:32.831580] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3588243 ] 00:30:22.781 [2024-07-12 22:37:32.949797] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:22.781 [2024-07-12 22:37:33.052181] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:22.781 [2024-07-12 22:37:33.052194] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:23.717 [2024-07-12 22:37:33.800650] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:23.717 22:37:33 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:23.717 22:37:33 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:23.717 22:37:33 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:30:23.717 22:37:33 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:23.717 22:37:33 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:24.651 [2024-07-12 22:37:34.702905] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21163c0 PMD being used: compress_qat 00:30:24.651 22:37:34 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:24.651 22:37:34 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:24.651 22:37:34 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:24.651 22:37:34 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:24.651 22:37:34 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:24.651 22:37:34 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:24.651 22:37:34 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:24.909 22:37:34 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:25.168 [ 00:30:25.168 { 00:30:25.168 "name": "Nvme0n1", 00:30:25.168 "aliases": [ 00:30:25.168 "01000000-0000-0000-5cd2-e43197705251" 00:30:25.168 ], 00:30:25.168 "product_name": "NVMe disk", 00:30:25.168 "block_size": 512, 00:30:25.168 "num_blocks": 15002931888, 00:30:25.168 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:25.168 "assigned_rate_limits": { 00:30:25.168 "rw_ios_per_sec": 0, 00:30:25.168 "rw_mbytes_per_sec": 0, 00:30:25.168 "r_mbytes_per_sec": 0, 00:30:25.168 "w_mbytes_per_sec": 0 00:30:25.168 }, 00:30:25.168 "claimed": false, 00:30:25.168 "zoned": false, 00:30:25.168 "supported_io_types": { 00:30:25.168 "read": true, 00:30:25.168 "write": true, 00:30:25.168 "unmap": true, 00:30:25.168 "flush": true, 00:30:25.168 "reset": true, 00:30:25.168 "nvme_admin": true, 00:30:25.168 "nvme_io": true, 00:30:25.168 "nvme_io_md": false, 00:30:25.168 "write_zeroes": true, 00:30:25.168 "zcopy": false, 
00:30:25.168 "get_zone_info": false, 00:30:25.168 "zone_management": false, 00:30:25.168 "zone_append": false, 00:30:25.168 "compare": false, 00:30:25.168 "compare_and_write": false, 00:30:25.168 "abort": true, 00:30:25.168 "seek_hole": false, 00:30:25.168 "seek_data": false, 00:30:25.168 "copy": false, 00:30:25.168 "nvme_iov_md": false 00:30:25.168 }, 00:30:25.168 "driver_specific": { 00:30:25.168 "nvme": [ 00:30:25.168 { 00:30:25.168 "pci_address": "0000:5e:00.0", 00:30:25.168 "trid": { 00:30:25.168 "trtype": "PCIe", 00:30:25.168 "traddr": "0000:5e:00.0" 00:30:25.168 }, 00:30:25.168 "ctrlr_data": { 00:30:25.168 "cntlid": 0, 00:30:25.168 "vendor_id": "0x8086", 00:30:25.168 "model_number": "INTEL SSDPF2KX076TZO", 00:30:25.168 "serial_number": "PHAC0301002G7P6CGN", 00:30:25.168 "firmware_revision": "JCV10200", 00:30:25.168 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:25.168 "oacs": { 00:30:25.168 "security": 1, 00:30:25.168 "format": 1, 00:30:25.168 "firmware": 1, 00:30:25.168 "ns_manage": 1 00:30:25.168 }, 00:30:25.168 "multi_ctrlr": false, 00:30:25.168 "ana_reporting": false 00:30:25.168 }, 00:30:25.168 "vs": { 00:30:25.168 "nvme_version": "1.3" 00:30:25.168 }, 00:30:25.168 "ns_data": { 00:30:25.168 "id": 1, 00:30:25.168 "can_share": false 00:30:25.168 }, 00:30:25.168 "security": { 00:30:25.168 "opal": true 00:30:25.168 } 00:30:25.168 } 00:30:25.168 ], 00:30:25.168 "mp_policy": "active_passive" 00:30:25.168 } 00:30:25.168 } 00:30:25.168 ] 00:30:25.168 22:37:35 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:25.168 22:37:35 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:25.427 [2024-07-12 22:37:35.717480] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f7b0d0 PMD being used: compress_qat 00:30:27.961 9e2cbb0e-a63e-4f44-97b7-c23baa90a680 00:30:27.961 22:37:37 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:27.961 f5d21050-4e75-4d7c-967c-1ec4563f6aa1 00:30:27.961 22:37:38 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:27.961 22:37:38 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:27.961 22:37:38 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:27.961 22:37:38 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:27.961 22:37:38 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:27.961 22:37:38 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:27.961 22:37:38 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:28.528 22:37:38 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:28.787 [ 00:30:28.787 { 00:30:28.787 "name": "f5d21050-4e75-4d7c-967c-1ec4563f6aa1", 00:30:28.787 "aliases": [ 00:30:28.787 "lvs0/lv0" 00:30:28.787 ], 00:30:28.787 "product_name": "Logical Volume", 00:30:28.787 "block_size": 512, 00:30:28.787 "num_blocks": 204800, 00:30:28.787 "uuid": "f5d21050-4e75-4d7c-967c-1ec4563f6aa1", 00:30:28.787 "assigned_rate_limits": { 00:30:28.787 "rw_ios_per_sec": 0, 00:30:28.787 "rw_mbytes_per_sec": 0, 00:30:28.787 "r_mbytes_per_sec": 0, 00:30:28.787 
"w_mbytes_per_sec": 0 00:30:28.787 }, 00:30:28.787 "claimed": false, 00:30:28.787 "zoned": false, 00:30:28.787 "supported_io_types": { 00:30:28.787 "read": true, 00:30:28.787 "write": true, 00:30:28.787 "unmap": true, 00:30:28.787 "flush": false, 00:30:28.787 "reset": true, 00:30:28.787 "nvme_admin": false, 00:30:28.787 "nvme_io": false, 00:30:28.787 "nvme_io_md": false, 00:30:28.787 "write_zeroes": true, 00:30:28.787 "zcopy": false, 00:30:28.787 "get_zone_info": false, 00:30:28.787 "zone_management": false, 00:30:28.787 "zone_append": false, 00:30:28.787 "compare": false, 00:30:28.787 "compare_and_write": false, 00:30:28.787 "abort": false, 00:30:28.787 "seek_hole": true, 00:30:28.787 "seek_data": true, 00:30:28.787 "copy": false, 00:30:28.787 "nvme_iov_md": false 00:30:28.787 }, 00:30:28.787 "driver_specific": { 00:30:28.787 "lvol": { 00:30:28.787 "lvol_store_uuid": "9e2cbb0e-a63e-4f44-97b7-c23baa90a680", 00:30:28.787 "base_bdev": "Nvme0n1", 00:30:28.787 "thin_provision": true, 00:30:28.787 "num_allocated_clusters": 0, 00:30:28.787 "snapshot": false, 00:30:28.787 "clone": false, 00:30:28.787 "esnap_clone": false 00:30:28.787 } 00:30:28.787 } 00:30:28.787 } 00:30:28.787 ] 00:30:28.787 22:37:38 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:28.787 22:37:38 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:28.787 22:37:38 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:29.046 [2024-07-12 22:37:39.164664] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:29.046 COMP_lvs0/lv0 00:30:29.046 22:37:39 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:29.046 22:37:39 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:29.046 22:37:39 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:29.046 22:37:39 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:29.046 22:37:39 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:29.046 22:37:39 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:29.046 22:37:39 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:29.615 22:37:39 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:29.615 [ 00:30:29.615 { 00:30:29.615 "name": "COMP_lvs0/lv0", 00:30:29.615 "aliases": [ 00:30:29.615 "84461b35-322b-5653-b022-a6f800d91c16" 00:30:29.615 ], 00:30:29.615 "product_name": "compress", 00:30:29.615 "block_size": 512, 00:30:29.615 "num_blocks": 200704, 00:30:29.615 "uuid": "84461b35-322b-5653-b022-a6f800d91c16", 00:30:29.615 "assigned_rate_limits": { 00:30:29.615 "rw_ios_per_sec": 0, 00:30:29.615 "rw_mbytes_per_sec": 0, 00:30:29.615 "r_mbytes_per_sec": 0, 00:30:29.615 "w_mbytes_per_sec": 0 00:30:29.615 }, 00:30:29.615 "claimed": false, 00:30:29.615 "zoned": false, 00:30:29.615 "supported_io_types": { 00:30:29.615 "read": true, 00:30:29.615 "write": true, 00:30:29.615 "unmap": false, 00:30:29.615 "flush": false, 00:30:29.615 "reset": false, 00:30:29.615 "nvme_admin": false, 00:30:29.615 "nvme_io": false, 00:30:29.615 "nvme_io_md": false, 00:30:29.615 "write_zeroes": true, 00:30:29.615 
"zcopy": false, 00:30:29.615 "get_zone_info": false, 00:30:29.615 "zone_management": false, 00:30:29.615 "zone_append": false, 00:30:29.615 "compare": false, 00:30:29.615 "compare_and_write": false, 00:30:29.615 "abort": false, 00:30:29.615 "seek_hole": false, 00:30:29.615 "seek_data": false, 00:30:29.615 "copy": false, 00:30:29.615 "nvme_iov_md": false 00:30:29.615 }, 00:30:29.615 "driver_specific": { 00:30:29.615 "compress": { 00:30:29.615 "name": "COMP_lvs0/lv0", 00:30:29.615 "base_bdev_name": "f5d21050-4e75-4d7c-967c-1ec4563f6aa1" 00:30:29.615 } 00:30:29.615 } 00:30:29.615 } 00:30:29.615 ] 00:30:29.615 22:37:39 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:29.615 22:37:39 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:29.874 [2024-07-12 22:37:40.047888] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7ffbe01b15c0 PMD being used: compress_qat 00:30:29.874 [2024-07-12 22:37:40.050051] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2113670 PMD being used: compress_qat 00:30:29.874 Running I/O for 3 seconds... 00:30:33.163 00:30:33.163 Latency(us) 00:30:33.163 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:33.163 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:33.163 Verification LBA range: start 0x0 length 0x3100 00:30:33.163 COMP_lvs0/lv0 : 3.00 5145.30 20.10 0.00 0.00 6166.98 505.77 5841.25 00:30:33.163 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:33.163 Verification LBA range: start 0x3100 length 0x3100 00:30:33.163 COMP_lvs0/lv0 : 3.00 5416.48 21.16 0.00 0.00 5871.58 447.00 5812.76 00:30:33.163 =================================================================================================================== 00:30:33.163 Total : 10561.77 41.26 0.00 0.00 6015.50 447.00 5841.25 00:30:33.163 0 00:30:33.163 22:37:43 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:33.163 22:37:43 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:33.421 22:37:43 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:33.679 22:37:43 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:33.679 22:37:43 compress_compdev -- compress/compress.sh@78 -- # killprocess 3588243 00:30:33.679 22:37:43 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 3588243 ']' 00:30:33.679 22:37:43 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 3588243 00:30:33.679 22:37:43 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:33.679 22:37:43 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:33.679 22:37:43 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3588243 00:30:33.679 22:37:43 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:33.679 22:37:43 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:33.679 22:37:43 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3588243' 00:30:33.679 killing process with pid 3588243 00:30:33.679 22:37:43 compress_compdev -- common/autotest_common.sh@967 -- # kill 3588243 00:30:33.679 Received 
shutdown signal, test time was about 3.000000 seconds 00:30:33.679 00:30:33.679 Latency(us) 00:30:33.679 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:33.680 =================================================================================================================== 00:30:33.680 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:33.680 22:37:43 compress_compdev -- common/autotest_common.sh@972 -- # wait 3588243 00:30:36.969 22:37:46 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:30:36.969 22:37:46 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:36.969 22:37:46 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=3590114 00:30:36.969 22:37:46 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:36.969 22:37:46 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 3590114 00:30:36.969 22:37:46 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 3590114 ']' 00:30:36.969 22:37:46 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:36.969 22:37:46 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:36.969 22:37:46 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:36.969 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:36.969 22:37:46 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:36.969 22:37:46 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:36.969 22:37:46 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:36.969 [2024-07-12 22:37:46.954588] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
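While the second bdevperf instance initializes (its EAL parameters follow below), note that this pass repeats the same volume lifecycle as the first but hands bdev_compress_create an explicit 512-byte logical block size. Stripped of the xtrace noise, the RPC sequence in these runs is roughly the following sketch (paths shortened; the gen_nvme.sh pipe into load_subsystem_config is an assumption inferred from both commands appearing on compress.sh line 34 above):

    # Sketch of the create/destroy volume sequence repeated for each bdevperf run.
    rpc=./scripts/rpc.py
    create_vols() {
        local lb_size=$1                                     # empty, 512 or 4096 in the three runs
        ./scripts/gen_nvme.sh | $rpc load_subsystem_config   # attach the NVMe disk as Nvme0n1
        $rpc bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
        $rpc bdev_lvol_create -t -l lvs0 lv0 100             # 100 MiB thin-provisioned volume
        if [ -z "$lb_size" ]; then
            $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem
        else
            $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l "$lb_size"
        fi
    }
    destroy_vols() {
        $rpc bdev_compress_delete COMP_lvs0/lv0
        $rpc bdev_lvol_delete_lvstore -l lvs0
    }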
00:30:36.969 [2024-07-12 22:37:46.954648] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3590114 ] 00:30:36.969 [2024-07-12 22:37:47.056056] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:36.969 [2024-07-12 22:37:47.163597] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:36.969 [2024-07-12 22:37:47.163601] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:37.905 [2024-07-12 22:37:47.913161] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:37.905 22:37:48 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:37.905 22:37:48 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:37.905 22:37:48 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:30:37.905 22:37:48 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:37.905 22:37:48 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:38.472 [2024-07-12 22:37:48.735222] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x19883c0 PMD being used: compress_qat 00:30:38.472 22:37:48 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:38.472 22:37:48 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:38.472 22:37:48 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:38.472 22:37:48 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:38.472 22:37:48 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:38.472 22:37:48 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:38.472 22:37:48 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:38.731 22:37:49 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:38.990 [ 00:30:38.990 { 00:30:38.990 "name": "Nvme0n1", 00:30:38.990 "aliases": [ 00:30:38.990 "01000000-0000-0000-5cd2-e43197705251" 00:30:38.990 ], 00:30:38.990 "product_name": "NVMe disk", 00:30:38.990 "block_size": 512, 00:30:38.990 "num_blocks": 15002931888, 00:30:38.990 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:38.990 "assigned_rate_limits": { 00:30:38.990 "rw_ios_per_sec": 0, 00:30:38.990 "rw_mbytes_per_sec": 0, 00:30:38.990 "r_mbytes_per_sec": 0, 00:30:38.990 "w_mbytes_per_sec": 0 00:30:38.990 }, 00:30:38.990 "claimed": false, 00:30:38.990 "zoned": false, 00:30:38.990 "supported_io_types": { 00:30:38.990 "read": true, 00:30:38.990 "write": true, 00:30:38.990 "unmap": true, 00:30:38.990 "flush": true, 00:30:38.990 "reset": true, 00:30:38.990 "nvme_admin": true, 00:30:38.990 "nvme_io": true, 00:30:38.990 "nvme_io_md": false, 00:30:38.990 "write_zeroes": true, 00:30:38.990 "zcopy": false, 00:30:38.990 "get_zone_info": false, 00:30:38.990 "zone_management": false, 00:30:38.990 "zone_append": false, 00:30:38.990 "compare": false, 00:30:38.990 "compare_and_write": false, 00:30:38.990 "abort": true, 00:30:38.990 "seek_hole": false, 00:30:38.990 "seek_data": false, 00:30:38.990 
"copy": false, 00:30:38.990 "nvme_iov_md": false 00:30:38.990 }, 00:30:38.990 "driver_specific": { 00:30:38.990 "nvme": [ 00:30:38.990 { 00:30:38.990 "pci_address": "0000:5e:00.0", 00:30:38.990 "trid": { 00:30:38.990 "trtype": "PCIe", 00:30:38.990 "traddr": "0000:5e:00.0" 00:30:38.990 }, 00:30:38.990 "ctrlr_data": { 00:30:38.990 "cntlid": 0, 00:30:38.990 "vendor_id": "0x8086", 00:30:38.990 "model_number": "INTEL SSDPF2KX076TZO", 00:30:38.990 "serial_number": "PHAC0301002G7P6CGN", 00:30:38.990 "firmware_revision": "JCV10200", 00:30:38.990 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:38.990 "oacs": { 00:30:38.990 "security": 1, 00:30:38.990 "format": 1, 00:30:38.990 "firmware": 1, 00:30:38.990 "ns_manage": 1 00:30:38.990 }, 00:30:38.990 "multi_ctrlr": false, 00:30:38.990 "ana_reporting": false 00:30:38.990 }, 00:30:38.990 "vs": { 00:30:38.990 "nvme_version": "1.3" 00:30:38.990 }, 00:30:38.990 "ns_data": { 00:30:38.990 "id": 1, 00:30:38.990 "can_share": false 00:30:38.990 }, 00:30:38.990 "security": { 00:30:38.990 "opal": true 00:30:38.990 } 00:30:38.990 } 00:30:38.990 ], 00:30:38.990 "mp_policy": "active_passive" 00:30:38.990 } 00:30:38.990 } 00:30:38.990 ] 00:30:38.990 22:37:49 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:38.990 22:37:49 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:39.558 [2024-07-12 22:37:49.745592] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17ed0d0 PMD being used: compress_qat 00:30:42.159 eb2b60b2-ffdd-4130-b1d5-233bd2c4f457 00:30:42.159 22:37:51 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:42.159 e53b2023-6f29-4a02-a5ad-9f9f9f3e3f7f 00:30:42.159 22:37:52 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:42.159 22:37:52 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:42.159 22:37:52 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:42.159 22:37:52 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:42.159 22:37:52 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:42.159 22:37:52 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:42.159 22:37:52 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:42.419 22:37:52 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:42.678 [ 00:30:42.678 { 00:30:42.678 "name": "e53b2023-6f29-4a02-a5ad-9f9f9f3e3f7f", 00:30:42.678 "aliases": [ 00:30:42.678 "lvs0/lv0" 00:30:42.678 ], 00:30:42.678 "product_name": "Logical Volume", 00:30:42.678 "block_size": 512, 00:30:42.678 "num_blocks": 204800, 00:30:42.678 "uuid": "e53b2023-6f29-4a02-a5ad-9f9f9f3e3f7f", 00:30:42.678 "assigned_rate_limits": { 00:30:42.678 "rw_ios_per_sec": 0, 00:30:42.678 "rw_mbytes_per_sec": 0, 00:30:42.678 "r_mbytes_per_sec": 0, 00:30:42.678 "w_mbytes_per_sec": 0 00:30:42.678 }, 00:30:42.678 "claimed": false, 00:30:42.678 "zoned": false, 00:30:42.678 "supported_io_types": { 00:30:42.678 "read": true, 00:30:42.678 "write": true, 00:30:42.678 "unmap": true, 00:30:42.678 "flush": false, 00:30:42.678 "reset": true, 00:30:42.678 
"nvme_admin": false, 00:30:42.678 "nvme_io": false, 00:30:42.678 "nvme_io_md": false, 00:30:42.678 "write_zeroes": true, 00:30:42.678 "zcopy": false, 00:30:42.678 "get_zone_info": false, 00:30:42.678 "zone_management": false, 00:30:42.678 "zone_append": false, 00:30:42.678 "compare": false, 00:30:42.678 "compare_and_write": false, 00:30:42.678 "abort": false, 00:30:42.678 "seek_hole": true, 00:30:42.678 "seek_data": true, 00:30:42.678 "copy": false, 00:30:42.678 "nvme_iov_md": false 00:30:42.678 }, 00:30:42.678 "driver_specific": { 00:30:42.678 "lvol": { 00:30:42.678 "lvol_store_uuid": "eb2b60b2-ffdd-4130-b1d5-233bd2c4f457", 00:30:42.678 "base_bdev": "Nvme0n1", 00:30:42.678 "thin_provision": true, 00:30:42.678 "num_allocated_clusters": 0, 00:30:42.678 "snapshot": false, 00:30:42.678 "clone": false, 00:30:42.678 "esnap_clone": false 00:30:42.678 } 00:30:42.678 } 00:30:42.678 } 00:30:42.678 ] 00:30:42.678 22:37:52 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:42.678 22:37:52 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:30:42.678 22:37:52 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:30:42.937 [2024-07-12 22:37:53.205110] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:42.937 COMP_lvs0/lv0 00:30:42.937 22:37:53 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:42.937 22:37:53 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:42.937 22:37:53 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:42.937 22:37:53 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:42.937 22:37:53 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:42.937 22:37:53 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:42.937 22:37:53 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:43.507 22:37:53 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:43.766 [ 00:30:43.766 { 00:30:43.766 "name": "COMP_lvs0/lv0", 00:30:43.766 "aliases": [ 00:30:43.766 "846966d8-eca8-531b-b5b7-b1b2b9eeb0dd" 00:30:43.766 ], 00:30:43.766 "product_name": "compress", 00:30:43.766 "block_size": 512, 00:30:43.766 "num_blocks": 200704, 00:30:43.766 "uuid": "846966d8-eca8-531b-b5b7-b1b2b9eeb0dd", 00:30:43.766 "assigned_rate_limits": { 00:30:43.766 "rw_ios_per_sec": 0, 00:30:43.766 "rw_mbytes_per_sec": 0, 00:30:43.766 "r_mbytes_per_sec": 0, 00:30:43.766 "w_mbytes_per_sec": 0 00:30:43.766 }, 00:30:43.766 "claimed": false, 00:30:43.766 "zoned": false, 00:30:43.766 "supported_io_types": { 00:30:43.766 "read": true, 00:30:43.766 "write": true, 00:30:43.766 "unmap": false, 00:30:43.766 "flush": false, 00:30:43.766 "reset": false, 00:30:43.766 "nvme_admin": false, 00:30:43.766 "nvme_io": false, 00:30:43.766 "nvme_io_md": false, 00:30:43.766 "write_zeroes": true, 00:30:43.766 "zcopy": false, 00:30:43.766 "get_zone_info": false, 00:30:43.766 "zone_management": false, 00:30:43.766 "zone_append": false, 00:30:43.766 "compare": false, 00:30:43.766 "compare_and_write": false, 00:30:43.766 "abort": false, 00:30:43.766 "seek_hole": false, 00:30:43.766 
"seek_data": false, 00:30:43.766 "copy": false, 00:30:43.766 "nvme_iov_md": false 00:30:43.766 }, 00:30:43.766 "driver_specific": { 00:30:43.767 "compress": { 00:30:43.767 "name": "COMP_lvs0/lv0", 00:30:43.767 "base_bdev_name": "e53b2023-6f29-4a02-a5ad-9f9f9f3e3f7f" 00:30:43.767 } 00:30:43.767 } 00:30:43.767 } 00:30:43.767 ] 00:30:43.767 22:37:53 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:43.767 22:37:53 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:44.026 [2024-07-12 22:37:54.216638] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fcbc41b15c0 PMD being used: compress_qat 00:30:44.026 [2024-07-12 22:37:54.218848] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1985700 PMD being used: compress_qat 00:30:44.026 Running I/O for 3 seconds... 00:30:47.317 00:30:47.317 Latency(us) 00:30:47.317 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:47.317 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:47.317 Verification LBA range: start 0x0 length 0x3100 00:30:47.317 COMP_lvs0/lv0 : 3.00 5129.97 20.04 0.00 0.00 6186.70 552.07 5755.77 00:30:47.317 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:47.317 Verification LBA range: start 0x3100 length 0x3100 00:30:47.317 COMP_lvs0/lv0 : 3.00 5407.55 21.12 0.00 0.00 5880.80 356.17 5641.79 00:30:47.317 =================================================================================================================== 00:30:47.317 Total : 10537.52 41.16 0.00 0.00 6029.72 356.17 5755.77 00:30:47.317 0 00:30:47.317 22:37:57 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:47.318 22:37:57 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:47.318 22:37:57 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:47.884 22:37:58 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:47.884 22:37:58 compress_compdev -- compress/compress.sh@78 -- # killprocess 3590114 00:30:47.884 22:37:58 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 3590114 ']' 00:30:47.884 22:37:58 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 3590114 00:30:47.884 22:37:58 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:47.884 22:37:58 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:47.884 22:37:58 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3590114 00:30:47.884 22:37:58 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:47.884 22:37:58 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:47.884 22:37:58 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3590114' 00:30:47.884 killing process with pid 3590114 00:30:47.884 22:37:58 compress_compdev -- common/autotest_common.sh@967 -- # kill 3590114 00:30:47.884 Received shutdown signal, test time was about 3.000000 seconds 00:30:47.884 00:30:47.884 Latency(us) 00:30:47.884 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:47.884 
=================================================================================================================== 00:30:47.884 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:47.884 22:37:58 compress_compdev -- common/autotest_common.sh@972 -- # wait 3590114 00:30:51.172 22:38:01 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:30:51.172 22:38:01 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:51.172 22:38:01 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=3591894 00:30:51.172 22:38:01 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:51.172 22:38:01 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:51.172 22:38:01 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 3591894 00:30:51.172 22:38:01 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 3591894 ']' 00:30:51.172 22:38:01 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:51.172 22:38:01 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:51.172 22:38:01 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:51.172 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:51.172 22:38:01 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:51.172 22:38:01 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:51.172 [2024-07-12 22:38:01.115415] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
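While this third instance starts up (EAL parameters below), here is a condensed sketch of how each pass drives the workload end to end: bdevperf is started idle, the compress vbdev is stacked on the lvol, the queued verify job is kicked off over RPC, and everything is torn down. Flags are copied from the trace; paths are shortened and the helper names refer to the sketches above:

    # Sketch: one bdevperf pass as traced in this log (third run, 4096-byte blocks).
    bdevperf=./build/examples/bdevperf
    $bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 \
        -c ./test/compress/dpdk.json &
    bdevperf_pid=$!
    waitforlisten "$bdevperf_pid"                          # sketched earlier
    create_vols 4096                                       # registers COMP_lvs0/lv0 (see sketch above)
    ./examples/bdev/bdevperf/bdevperf.py perform_tests     # run the 3-second verify workload
    destroy_vols
    killprocess "$bdevperf_pid"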
00:30:51.172 [2024-07-12 22:38:01.115487] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3591894 ] 00:30:51.172 [2024-07-12 22:38:01.234732] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:51.172 [2024-07-12 22:38:01.336654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:51.172 [2024-07-12 22:38:01.336660] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:52.107 [2024-07-12 22:38:02.076912] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:52.107 22:38:02 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:52.107 22:38:02 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:52.107 22:38:02 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:30:52.107 22:38:02 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:52.107 22:38:02 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:52.365 [2024-07-12 22:38:02.655112] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2b0b3c0 PMD being used: compress_qat 00:30:52.365 22:38:02 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:52.365 22:38:02 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:52.365 22:38:02 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:52.365 22:38:02 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:52.365 22:38:02 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:52.365 22:38:02 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:52.365 22:38:02 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:52.629 22:38:02 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:52.888 [ 00:30:52.888 { 00:30:52.888 "name": "Nvme0n1", 00:30:52.888 "aliases": [ 00:30:52.888 "01000000-0000-0000-5cd2-e43197705251" 00:30:52.888 ], 00:30:52.888 "product_name": "NVMe disk", 00:30:52.888 "block_size": 512, 00:30:52.888 "num_blocks": 15002931888, 00:30:52.888 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:52.888 "assigned_rate_limits": { 00:30:52.888 "rw_ios_per_sec": 0, 00:30:52.888 "rw_mbytes_per_sec": 0, 00:30:52.888 "r_mbytes_per_sec": 0, 00:30:52.888 "w_mbytes_per_sec": 0 00:30:52.888 }, 00:30:52.888 "claimed": false, 00:30:52.888 "zoned": false, 00:30:52.888 "supported_io_types": { 00:30:52.888 "read": true, 00:30:52.888 "write": true, 00:30:52.888 "unmap": true, 00:30:52.888 "flush": true, 00:30:52.888 "reset": true, 00:30:52.888 "nvme_admin": true, 00:30:52.888 "nvme_io": true, 00:30:52.888 "nvme_io_md": false, 00:30:52.888 "write_zeroes": true, 00:30:52.888 "zcopy": false, 00:30:52.888 "get_zone_info": false, 00:30:52.888 "zone_management": false, 00:30:52.888 "zone_append": false, 00:30:52.888 "compare": false, 00:30:52.888 "compare_and_write": false, 00:30:52.888 "abort": true, 00:30:52.888 "seek_hole": false, 00:30:52.888 "seek_data": false, 00:30:52.888 
"copy": false, 00:30:52.888 "nvme_iov_md": false 00:30:52.888 }, 00:30:52.888 "driver_specific": { 00:30:52.888 "nvme": [ 00:30:52.888 { 00:30:52.888 "pci_address": "0000:5e:00.0", 00:30:52.888 "trid": { 00:30:52.888 "trtype": "PCIe", 00:30:52.888 "traddr": "0000:5e:00.0" 00:30:52.888 }, 00:30:52.888 "ctrlr_data": { 00:30:52.888 "cntlid": 0, 00:30:52.888 "vendor_id": "0x8086", 00:30:52.888 "model_number": "INTEL SSDPF2KX076TZO", 00:30:52.888 "serial_number": "PHAC0301002G7P6CGN", 00:30:52.888 "firmware_revision": "JCV10200", 00:30:52.888 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:52.888 "oacs": { 00:30:52.888 "security": 1, 00:30:52.888 "format": 1, 00:30:52.888 "firmware": 1, 00:30:52.888 "ns_manage": 1 00:30:52.888 }, 00:30:52.888 "multi_ctrlr": false, 00:30:52.888 "ana_reporting": false 00:30:52.888 }, 00:30:52.888 "vs": { 00:30:52.888 "nvme_version": "1.3" 00:30:52.888 }, 00:30:52.888 "ns_data": { 00:30:52.888 "id": 1, 00:30:52.888 "can_share": false 00:30:52.888 }, 00:30:52.888 "security": { 00:30:52.888 "opal": true 00:30:52.888 } 00:30:52.888 } 00:30:52.888 ], 00:30:52.888 "mp_policy": "active_passive" 00:30:52.888 } 00:30:52.888 } 00:30:52.888 ] 00:30:52.888 22:38:03 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:52.888 22:38:03 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:53.146 [2024-07-12 22:38:03.348604] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2970660 PMD being used: compress_qat 00:30:55.681 9658a9a4-92f1-4aca-8e30-20ae70d6f94f 00:30:55.681 22:38:05 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:55.681 d38e7b23-0db7-4b88-9368-5d5a6d66fb86 00:30:55.681 22:38:05 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:55.681 22:38:05 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:55.681 22:38:05 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:55.681 22:38:05 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:55.681 22:38:05 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:55.681 22:38:05 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:55.681 22:38:05 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:55.939 22:38:06 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:55.939 [ 00:30:55.939 { 00:30:55.939 "name": "d38e7b23-0db7-4b88-9368-5d5a6d66fb86", 00:30:55.939 "aliases": [ 00:30:55.939 "lvs0/lv0" 00:30:55.939 ], 00:30:55.939 "product_name": "Logical Volume", 00:30:55.939 "block_size": 512, 00:30:55.939 "num_blocks": 204800, 00:30:55.939 "uuid": "d38e7b23-0db7-4b88-9368-5d5a6d66fb86", 00:30:55.939 "assigned_rate_limits": { 00:30:55.939 "rw_ios_per_sec": 0, 00:30:55.939 "rw_mbytes_per_sec": 0, 00:30:55.939 "r_mbytes_per_sec": 0, 00:30:55.939 "w_mbytes_per_sec": 0 00:30:55.939 }, 00:30:55.940 "claimed": false, 00:30:55.940 "zoned": false, 00:30:55.940 "supported_io_types": { 00:30:55.940 "read": true, 00:30:55.940 "write": true, 00:30:55.940 "unmap": true, 00:30:55.940 "flush": false, 00:30:55.940 "reset": true, 00:30:55.940 
"nvme_admin": false, 00:30:55.940 "nvme_io": false, 00:30:55.940 "nvme_io_md": false, 00:30:55.940 "write_zeroes": true, 00:30:55.940 "zcopy": false, 00:30:55.940 "get_zone_info": false, 00:30:55.940 "zone_management": false, 00:30:55.940 "zone_append": false, 00:30:55.940 "compare": false, 00:30:55.940 "compare_and_write": false, 00:30:55.940 "abort": false, 00:30:55.940 "seek_hole": true, 00:30:55.940 "seek_data": true, 00:30:55.940 "copy": false, 00:30:55.940 "nvme_iov_md": false 00:30:55.940 }, 00:30:55.940 "driver_specific": { 00:30:55.940 "lvol": { 00:30:55.940 "lvol_store_uuid": "9658a9a4-92f1-4aca-8e30-20ae70d6f94f", 00:30:55.940 "base_bdev": "Nvme0n1", 00:30:55.940 "thin_provision": true, 00:30:55.940 "num_allocated_clusters": 0, 00:30:55.940 "snapshot": false, 00:30:55.940 "clone": false, 00:30:55.940 "esnap_clone": false 00:30:55.940 } 00:30:55.940 } 00:30:55.940 } 00:30:55.940 ] 00:30:55.940 22:38:06 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:55.940 22:38:06 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:30:55.940 22:38:06 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:30:56.198 [2024-07-12 22:38:06.487085] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:56.198 COMP_lvs0/lv0 00:30:56.198 22:38:06 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:56.198 22:38:06 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:56.198 22:38:06 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:56.198 22:38:06 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:56.198 22:38:06 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:56.198 22:38:06 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:56.198 22:38:06 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:56.457 22:38:06 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:56.716 [ 00:30:56.716 { 00:30:56.716 "name": "COMP_lvs0/lv0", 00:30:56.716 "aliases": [ 00:30:56.716 "d42ba209-eb9a-54f3-91f1-f16df51bedf8" 00:30:56.716 ], 00:30:56.716 "product_name": "compress", 00:30:56.716 "block_size": 4096, 00:30:56.716 "num_blocks": 25088, 00:30:56.716 "uuid": "d42ba209-eb9a-54f3-91f1-f16df51bedf8", 00:30:56.716 "assigned_rate_limits": { 00:30:56.716 "rw_ios_per_sec": 0, 00:30:56.716 "rw_mbytes_per_sec": 0, 00:30:56.716 "r_mbytes_per_sec": 0, 00:30:56.716 "w_mbytes_per_sec": 0 00:30:56.716 }, 00:30:56.716 "claimed": false, 00:30:56.716 "zoned": false, 00:30:56.716 "supported_io_types": { 00:30:56.716 "read": true, 00:30:56.716 "write": true, 00:30:56.716 "unmap": false, 00:30:56.716 "flush": false, 00:30:56.716 "reset": false, 00:30:56.716 "nvme_admin": false, 00:30:56.716 "nvme_io": false, 00:30:56.716 "nvme_io_md": false, 00:30:56.716 "write_zeroes": true, 00:30:56.716 "zcopy": false, 00:30:56.716 "get_zone_info": false, 00:30:56.716 "zone_management": false, 00:30:56.716 "zone_append": false, 00:30:56.716 "compare": false, 00:30:56.716 "compare_and_write": false, 00:30:56.716 "abort": false, 00:30:56.716 "seek_hole": false, 00:30:56.716 
"seek_data": false, 00:30:56.716 "copy": false, 00:30:56.716 "nvme_iov_md": false 00:30:56.716 }, 00:30:56.716 "driver_specific": { 00:30:56.716 "compress": { 00:30:56.716 "name": "COMP_lvs0/lv0", 00:30:56.716 "base_bdev_name": "d38e7b23-0db7-4b88-9368-5d5a6d66fb86" 00:30:56.716 } 00:30:56.716 } 00:30:56.716 } 00:30:56.716 ] 00:30:56.716 22:38:06 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:56.716 22:38:06 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:56.975 [2024-07-12 22:38:07.101395] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fa5501b15c0 PMD being used: compress_qat 00:30:56.975 [2024-07-12 22:38:07.103608] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2b08770 PMD being used: compress_qat 00:30:56.975 Running I/O for 3 seconds... 00:31:00.278 00:31:00.278 Latency(us) 00:31:00.278 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:00.278 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:00.278 Verification LBA range: start 0x0 length 0x3100 00:31:00.278 COMP_lvs0/lv0 : 3.00 5158.33 20.15 0.00 0.00 6153.09 505.77 6211.67 00:31:00.278 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:00.278 Verification LBA range: start 0x3100 length 0x3100 00:31:00.278 COMP_lvs0/lv0 : 3.00 5406.70 21.12 0.00 0.00 5882.69 357.95 5727.28 00:31:00.278 =================================================================================================================== 00:31:00.278 Total : 10565.02 41.27 0.00 0.00 6014.71 357.95 6211.67 00:31:00.278 0 00:31:00.278 22:38:10 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:31:00.278 22:38:10 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:00.278 22:38:10 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:00.278 22:38:10 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:00.278 22:38:10 compress_compdev -- compress/compress.sh@78 -- # killprocess 3591894 00:31:00.278 22:38:10 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 3591894 ']' 00:31:00.278 22:38:10 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 3591894 00:31:00.278 22:38:10 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:31:00.278 22:38:10 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:00.278 22:38:10 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3591894 00:31:00.537 22:38:10 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:00.537 22:38:10 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:00.537 22:38:10 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3591894' 00:31:00.537 killing process with pid 3591894 00:31:00.537 22:38:10 compress_compdev -- common/autotest_common.sh@967 -- # kill 3591894 00:31:00.537 Received shutdown signal, test time was about 3.000000 seconds 00:31:00.537 00:31:00.537 Latency(us) 00:31:00.537 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:00.537 
=================================================================================================================== 00:31:00.537 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:00.537 22:38:10 compress_compdev -- common/autotest_common.sh@972 -- # wait 3591894 00:31:03.827 22:38:13 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:31:03.827 22:38:13 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:03.827 22:38:13 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=3593493 00:31:03.828 22:38:13 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:03.828 22:38:13 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:31:03.828 22:38:13 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 3593493 00:31:03.828 22:38:13 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 3593493 ']' 00:31:03.828 22:38:13 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:03.828 22:38:13 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:03.828 22:38:13 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:03.828 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:03.828 22:38:13 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:03.828 22:38:13 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:03.828 [2024-07-12 22:38:13.687683] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:31:03.828 [2024-07-12 22:38:13.687759] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3593493 ] 00:31:03.828 [2024-07-12 22:38:13.820148] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:03.828 [2024-07-12 22:38:13.918876] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:03.828 [2024-07-12 22:38:13.918963] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:03.828 [2024-07-12 22:38:13.918971] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:04.395 [2024-07-12 22:38:14.666808] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:04.654 22:38:14 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:04.654 22:38:14 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:31:04.654 22:38:14 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:31:04.654 22:38:14 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:04.654 22:38:14 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:05.221 [2024-07-12 22:38:15.239481] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x14b8f20 PMD being used: compress_qat 00:31:05.221 22:38:15 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:05.221 22:38:15 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:05.221 22:38:15 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:05.221 22:38:15 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:05.221 22:38:15 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:05.221 22:38:15 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:05.221 22:38:15 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:05.221 22:38:15 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:05.480 [ 00:31:05.480 { 00:31:05.480 "name": "Nvme0n1", 00:31:05.480 "aliases": [ 00:31:05.480 "01000000-0000-0000-5cd2-e43197705251" 00:31:05.480 ], 00:31:05.480 "product_name": "NVMe disk", 00:31:05.480 "block_size": 512, 00:31:05.480 "num_blocks": 15002931888, 00:31:05.480 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:05.480 "assigned_rate_limits": { 00:31:05.480 "rw_ios_per_sec": 0, 00:31:05.480 "rw_mbytes_per_sec": 0, 00:31:05.480 "r_mbytes_per_sec": 0, 00:31:05.480 "w_mbytes_per_sec": 0 00:31:05.480 }, 00:31:05.480 "claimed": false, 00:31:05.481 "zoned": false, 00:31:05.481 "supported_io_types": { 00:31:05.481 "read": true, 00:31:05.481 "write": true, 00:31:05.481 "unmap": true, 00:31:05.481 "flush": true, 00:31:05.481 "reset": true, 00:31:05.481 "nvme_admin": true, 00:31:05.481 "nvme_io": true, 00:31:05.481 "nvme_io_md": false, 00:31:05.481 "write_zeroes": true, 00:31:05.481 "zcopy": false, 00:31:05.481 "get_zone_info": false, 00:31:05.481 "zone_management": false, 00:31:05.481 "zone_append": false, 00:31:05.481 "compare": false, 00:31:05.481 "compare_and_write": false, 
00:31:05.481 "abort": true, 00:31:05.481 "seek_hole": false, 00:31:05.481 "seek_data": false, 00:31:05.481 "copy": false, 00:31:05.481 "nvme_iov_md": false 00:31:05.481 }, 00:31:05.481 "driver_specific": { 00:31:05.481 "nvme": [ 00:31:05.481 { 00:31:05.481 "pci_address": "0000:5e:00.0", 00:31:05.481 "trid": { 00:31:05.481 "trtype": "PCIe", 00:31:05.481 "traddr": "0000:5e:00.0" 00:31:05.481 }, 00:31:05.481 "ctrlr_data": { 00:31:05.481 "cntlid": 0, 00:31:05.481 "vendor_id": "0x8086", 00:31:05.481 "model_number": "INTEL SSDPF2KX076TZO", 00:31:05.481 "serial_number": "PHAC0301002G7P6CGN", 00:31:05.481 "firmware_revision": "JCV10200", 00:31:05.481 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:05.481 "oacs": { 00:31:05.481 "security": 1, 00:31:05.481 "format": 1, 00:31:05.481 "firmware": 1, 00:31:05.481 "ns_manage": 1 00:31:05.481 }, 00:31:05.481 "multi_ctrlr": false, 00:31:05.481 "ana_reporting": false 00:31:05.481 }, 00:31:05.481 "vs": { 00:31:05.481 "nvme_version": "1.3" 00:31:05.481 }, 00:31:05.481 "ns_data": { 00:31:05.481 "id": 1, 00:31:05.481 "can_share": false 00:31:05.481 }, 00:31:05.481 "security": { 00:31:05.481 "opal": true 00:31:05.481 } 00:31:05.481 } 00:31:05.481 ], 00:31:05.481 "mp_policy": "active_passive" 00:31:05.481 } 00:31:05.481 } 00:31:05.481 ] 00:31:05.481 22:38:15 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:05.481 22:38:15 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:05.807 [2024-07-12 22:38:15.984868] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1307390 PMD being used: compress_qat 00:31:08.387 cea38ada-0ea0-4a4e-83ff-e89060148fa6 00:31:08.387 22:38:18 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:08.387 f4bb98e4-9962-4d57-9845-f328fb0e4a62 00:31:08.387 22:38:18 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:08.387 22:38:18 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:08.387 22:38:18 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:08.387 22:38:18 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:08.387 22:38:18 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:08.387 22:38:18 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:08.387 22:38:18 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:08.387 22:38:18 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:08.646 [ 00:31:08.646 { 00:31:08.646 "name": "f4bb98e4-9962-4d57-9845-f328fb0e4a62", 00:31:08.646 "aliases": [ 00:31:08.647 "lvs0/lv0" 00:31:08.647 ], 00:31:08.647 "product_name": "Logical Volume", 00:31:08.647 "block_size": 512, 00:31:08.647 "num_blocks": 204800, 00:31:08.647 "uuid": "f4bb98e4-9962-4d57-9845-f328fb0e4a62", 00:31:08.647 "assigned_rate_limits": { 00:31:08.647 "rw_ios_per_sec": 0, 00:31:08.647 "rw_mbytes_per_sec": 0, 00:31:08.647 "r_mbytes_per_sec": 0, 00:31:08.647 "w_mbytes_per_sec": 0 00:31:08.647 }, 00:31:08.647 "claimed": false, 00:31:08.647 "zoned": false, 00:31:08.647 "supported_io_types": { 00:31:08.647 "read": true, 00:31:08.647 
"write": true, 00:31:08.647 "unmap": true, 00:31:08.647 "flush": false, 00:31:08.647 "reset": true, 00:31:08.647 "nvme_admin": false, 00:31:08.647 "nvme_io": false, 00:31:08.647 "nvme_io_md": false, 00:31:08.647 "write_zeroes": true, 00:31:08.647 "zcopy": false, 00:31:08.647 "get_zone_info": false, 00:31:08.647 "zone_management": false, 00:31:08.647 "zone_append": false, 00:31:08.647 "compare": false, 00:31:08.647 "compare_and_write": false, 00:31:08.647 "abort": false, 00:31:08.647 "seek_hole": true, 00:31:08.647 "seek_data": true, 00:31:08.647 "copy": false, 00:31:08.647 "nvme_iov_md": false 00:31:08.647 }, 00:31:08.647 "driver_specific": { 00:31:08.647 "lvol": { 00:31:08.647 "lvol_store_uuid": "cea38ada-0ea0-4a4e-83ff-e89060148fa6", 00:31:08.647 "base_bdev": "Nvme0n1", 00:31:08.647 "thin_provision": true, 00:31:08.647 "num_allocated_clusters": 0, 00:31:08.647 "snapshot": false, 00:31:08.647 "clone": false, 00:31:08.647 "esnap_clone": false 00:31:08.647 } 00:31:08.647 } 00:31:08.647 } 00:31:08.647 ] 00:31:08.647 22:38:18 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:08.647 22:38:18 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:08.647 22:38:18 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:08.906 [2024-07-12 22:38:19.164773] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:08.906 COMP_lvs0/lv0 00:31:08.906 22:38:19 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:08.906 22:38:19 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:08.906 22:38:19 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:08.906 22:38:19 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:08.906 22:38:19 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:08.906 22:38:19 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:08.906 22:38:19 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:09.165 22:38:19 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:09.425 [ 00:31:09.425 { 00:31:09.425 "name": "COMP_lvs0/lv0", 00:31:09.425 "aliases": [ 00:31:09.425 "1e83c3b4-0a37-5d7d-b63a-aff6dc68c67f" 00:31:09.425 ], 00:31:09.425 "product_name": "compress", 00:31:09.425 "block_size": 512, 00:31:09.425 "num_blocks": 200704, 00:31:09.425 "uuid": "1e83c3b4-0a37-5d7d-b63a-aff6dc68c67f", 00:31:09.425 "assigned_rate_limits": { 00:31:09.425 "rw_ios_per_sec": 0, 00:31:09.425 "rw_mbytes_per_sec": 0, 00:31:09.425 "r_mbytes_per_sec": 0, 00:31:09.425 "w_mbytes_per_sec": 0 00:31:09.425 }, 00:31:09.425 "claimed": false, 00:31:09.425 "zoned": false, 00:31:09.425 "supported_io_types": { 00:31:09.425 "read": true, 00:31:09.425 "write": true, 00:31:09.425 "unmap": false, 00:31:09.425 "flush": false, 00:31:09.425 "reset": false, 00:31:09.425 "nvme_admin": false, 00:31:09.425 "nvme_io": false, 00:31:09.425 "nvme_io_md": false, 00:31:09.426 "write_zeroes": true, 00:31:09.426 "zcopy": false, 00:31:09.426 "get_zone_info": false, 00:31:09.426 "zone_management": false, 00:31:09.426 "zone_append": false, 00:31:09.426 "compare": false, 00:31:09.426 
"compare_and_write": false, 00:31:09.426 "abort": false, 00:31:09.426 "seek_hole": false, 00:31:09.426 "seek_data": false, 00:31:09.426 "copy": false, 00:31:09.426 "nvme_iov_md": false 00:31:09.426 }, 00:31:09.426 "driver_specific": { 00:31:09.426 "compress": { 00:31:09.426 "name": "COMP_lvs0/lv0", 00:31:09.426 "base_bdev_name": "f4bb98e4-9962-4d57-9845-f328fb0e4a62" 00:31:09.426 } 00:31:09.426 } 00:31:09.426 } 00:31:09.426 ] 00:31:09.426 22:38:19 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:09.426 22:38:19 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:09.686 [2024-07-12 22:38:19.765699] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8b441b1350 PMD being used: compress_qat 00:31:09.687 I/O targets: 00:31:09.687 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:09.687 00:31:09.687 00:31:09.687 CUnit - A unit testing framework for C - Version 2.1-3 00:31:09.687 http://cunit.sourceforge.net/ 00:31:09.687 00:31:09.687 00:31:09.687 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:09.687 Test: blockdev write read block ...passed 00:31:09.687 Test: blockdev write zeroes read block ...passed 00:31:09.687 Test: blockdev write zeroes read no split ...passed 00:31:09.687 Test: blockdev write zeroes read split ...passed 00:31:09.687 Test: blockdev write zeroes read split partial ...passed 00:31:09.687 Test: blockdev reset ...[2024-07-12 22:38:19.803220] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:09.687 passed 00:31:09.687 Test: blockdev write read 8 blocks ...passed 00:31:09.687 Test: blockdev write read size > 128k ...passed 00:31:09.687 Test: blockdev write read invalid size ...passed 00:31:09.687 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:09.687 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:09.687 Test: blockdev write read max offset ...passed 00:31:09.687 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:09.687 Test: blockdev writev readv 8 blocks ...passed 00:31:09.687 Test: blockdev writev readv 30 x 1block ...passed 00:31:09.687 Test: blockdev writev readv block ...passed 00:31:09.687 Test: blockdev writev readv size > 128k ...passed 00:31:09.687 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:09.687 Test: blockdev comparev and writev ...passed 00:31:09.687 Test: blockdev nvme passthru rw ...passed 00:31:09.687 Test: blockdev nvme passthru vendor specific ...passed 00:31:09.687 Test: blockdev nvme admin passthru ...passed 00:31:09.687 Test: blockdev copy ...passed 00:31:09.687 00:31:09.687 Run Summary: Type Total Ran Passed Failed Inactive 00:31:09.687 suites 1 1 n/a 0 0 00:31:09.687 tests 23 23 23 0 0 00:31:09.687 asserts 130 130 130 0 n/a 00:31:09.687 00:31:09.687 Elapsed time = 0.091 seconds 00:31:09.687 0 00:31:09.687 22:38:19 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:31:09.687 22:38:19 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:09.946 22:38:20 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:10.205 22:38:20 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:10.205 22:38:20 compress_compdev -- compress/compress.sh@62 -- # killprocess 
3593493 00:31:10.205 22:38:20 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 3593493 ']' 00:31:10.205 22:38:20 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 3593493 00:31:10.205 22:38:20 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:31:10.205 22:38:20 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:10.205 22:38:20 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3593493 00:31:10.205 22:38:20 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:10.205 22:38:20 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:10.205 22:38:20 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3593493' 00:31:10.205 killing process with pid 3593493 00:31:10.205 22:38:20 compress_compdev -- common/autotest_common.sh@967 -- # kill 3593493 00:31:10.205 22:38:20 compress_compdev -- common/autotest_common.sh@972 -- # wait 3593493 00:31:13.510 22:38:23 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:13.510 22:38:23 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:13.510 00:31:13.510 real 0m50.729s 00:31:13.510 user 1m58.367s 00:31:13.510 sys 0m6.123s 00:31:13.510 22:38:23 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:13.510 22:38:23 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:13.510 ************************************ 00:31:13.510 END TEST compress_compdev 00:31:13.510 ************************************ 00:31:13.510 22:38:23 -- common/autotest_common.sh@1142 -- # return 0 00:31:13.510 22:38:23 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:13.510 22:38:23 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:13.510 22:38:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:13.510 22:38:23 -- common/autotest_common.sh@10 -- # set +x 00:31:13.510 ************************************ 00:31:13.510 START TEST compress_isal 00:31:13.510 ************************************ 00:31:13.510 22:38:23 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:13.510 * Looking for test storage... 
00:31:13.510 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:31:13.510 22:38:23 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:13.510 22:38:23 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:13.510 22:38:23 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:13.510 22:38:23 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:13.510 22:38:23 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:13.510 22:38:23 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:13.510 22:38:23 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:13.511 22:38:23 compress_isal -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:13.511 22:38:23 compress_isal -- paths/export.sh@5 -- # export PATH 00:31:13.511 22:38:23 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:13.511 22:38:23 compress_isal -- nvmf/common.sh@47 -- # : 0 00:31:13.511 22:38:23 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:13.511 22:38:23 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:13.511 22:38:23 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:13.511 22:38:23 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:13.511 22:38:23 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:13.511 22:38:23 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:13.511 22:38:23 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:13.511 22:38:23 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:13.511 22:38:23 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:13.511 22:38:23 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:31:13.511 22:38:23 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:31:13.511 22:38:23 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:31:13.511 22:38:23 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:13.511 22:38:23 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=3594795 00:31:13.511 22:38:23 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:13.511 22:38:23 compress_isal -- compress/compress.sh@73 -- # waitforlisten 3594795 00:31:13.511 22:38:23 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:13.511 22:38:23 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 3594795 ']' 00:31:13.511 22:38:23 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:13.511 22:38:23 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:13.511 22:38:23 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:13.511 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
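(For orientation, a minimal sketch of the run_bdevperf flow exercised in this log, assuming the workspace paths shown here; the flags are the ones actually logged: queue depth 32, 4096-byte I/Os, a 3-second verify workload on core mask 0x6. The backgrounding and pid capture below are illustrative, not copied from the script.)

  # launch bdevperf against the RPC-configured bdevs; with -z it waits until perform_tests is issued
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
      -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &
  bdevperf_pid=$!
  # once create_vols has registered COMP_lvs0/lv0, kick off the timed run over RPC
  /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests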
00:31:13.511 22:38:23 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:13.511 22:38:23 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:13.511 [2024-07-12 22:38:23.658320] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:31:13.511 [2024-07-12 22:38:23.658397] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3594795 ] 00:31:13.511 [2024-07-12 22:38:23.781345] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:13.770 [2024-07-12 22:38:23.884628] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:13.770 [2024-07-12 22:38:23.884633] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:14.338 22:38:24 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:14.338 22:38:24 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:14.338 22:38:24 compress_isal -- compress/compress.sh@74 -- # create_vols 00:31:14.338 22:38:24 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:14.338 22:38:24 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:14.906 22:38:25 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:14.906 22:38:25 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:14.906 22:38:25 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:14.906 22:38:25 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:14.906 22:38:25 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:14.906 22:38:25 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:14.906 22:38:25 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:15.165 22:38:25 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:15.424 [ 00:31:15.424 { 00:31:15.424 "name": "Nvme0n1", 00:31:15.424 "aliases": [ 00:31:15.424 "01000000-0000-0000-5cd2-e43197705251" 00:31:15.424 ], 00:31:15.424 "product_name": "NVMe disk", 00:31:15.424 "block_size": 512, 00:31:15.424 "num_blocks": 15002931888, 00:31:15.424 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:15.424 "assigned_rate_limits": { 00:31:15.424 "rw_ios_per_sec": 0, 00:31:15.424 "rw_mbytes_per_sec": 0, 00:31:15.424 "r_mbytes_per_sec": 0, 00:31:15.424 "w_mbytes_per_sec": 0 00:31:15.424 }, 00:31:15.424 "claimed": false, 00:31:15.424 "zoned": false, 00:31:15.424 "supported_io_types": { 00:31:15.424 "read": true, 00:31:15.424 "write": true, 00:31:15.424 "unmap": true, 00:31:15.424 "flush": true, 00:31:15.424 "reset": true, 00:31:15.424 "nvme_admin": true, 00:31:15.424 "nvme_io": true, 00:31:15.424 "nvme_io_md": false, 00:31:15.424 "write_zeroes": true, 00:31:15.424 "zcopy": false, 00:31:15.424 "get_zone_info": false, 00:31:15.424 "zone_management": false, 00:31:15.424 "zone_append": false, 00:31:15.424 "compare": false, 00:31:15.424 "compare_and_write": false, 00:31:15.424 "abort": true, 00:31:15.424 "seek_hole": false, 00:31:15.424 "seek_data": false, 00:31:15.424 "copy": false, 00:31:15.424 
"nvme_iov_md": false 00:31:15.424 }, 00:31:15.424 "driver_specific": { 00:31:15.424 "nvme": [ 00:31:15.424 { 00:31:15.424 "pci_address": "0000:5e:00.0", 00:31:15.424 "trid": { 00:31:15.424 "trtype": "PCIe", 00:31:15.424 "traddr": "0000:5e:00.0" 00:31:15.424 }, 00:31:15.424 "ctrlr_data": { 00:31:15.424 "cntlid": 0, 00:31:15.424 "vendor_id": "0x8086", 00:31:15.424 "model_number": "INTEL SSDPF2KX076TZO", 00:31:15.424 "serial_number": "PHAC0301002G7P6CGN", 00:31:15.424 "firmware_revision": "JCV10200", 00:31:15.424 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:15.424 "oacs": { 00:31:15.424 "security": 1, 00:31:15.424 "format": 1, 00:31:15.424 "firmware": 1, 00:31:15.424 "ns_manage": 1 00:31:15.424 }, 00:31:15.424 "multi_ctrlr": false, 00:31:15.424 "ana_reporting": false 00:31:15.424 }, 00:31:15.424 "vs": { 00:31:15.424 "nvme_version": "1.3" 00:31:15.424 }, 00:31:15.424 "ns_data": { 00:31:15.424 "id": 1, 00:31:15.424 "can_share": false 00:31:15.424 }, 00:31:15.424 "security": { 00:31:15.424 "opal": true 00:31:15.424 } 00:31:15.424 } 00:31:15.424 ], 00:31:15.424 "mp_policy": "active_passive" 00:31:15.424 } 00:31:15.424 } 00:31:15.424 ] 00:31:15.424 22:38:25 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:15.424 22:38:25 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:17.959 79294159-f4f0-4bd2-b0b4-5acaf37a4460 00:31:17.959 22:38:28 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:18.217 0c9673d7-6bdd-4147-a127-58dc0d2a5263 00:31:18.217 22:38:28 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:18.217 22:38:28 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:18.217 22:38:28 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:18.217 22:38:28 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:18.217 22:38:28 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:18.217 22:38:28 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:18.217 22:38:28 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:18.476 22:38:28 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:18.735 [ 00:31:18.735 { 00:31:18.735 "name": "0c9673d7-6bdd-4147-a127-58dc0d2a5263", 00:31:18.735 "aliases": [ 00:31:18.735 "lvs0/lv0" 00:31:18.735 ], 00:31:18.735 "product_name": "Logical Volume", 00:31:18.735 "block_size": 512, 00:31:18.735 "num_blocks": 204800, 00:31:18.735 "uuid": "0c9673d7-6bdd-4147-a127-58dc0d2a5263", 00:31:18.735 "assigned_rate_limits": { 00:31:18.735 "rw_ios_per_sec": 0, 00:31:18.735 "rw_mbytes_per_sec": 0, 00:31:18.735 "r_mbytes_per_sec": 0, 00:31:18.735 "w_mbytes_per_sec": 0 00:31:18.735 }, 00:31:18.735 "claimed": false, 00:31:18.735 "zoned": false, 00:31:18.735 "supported_io_types": { 00:31:18.735 "read": true, 00:31:18.735 "write": true, 00:31:18.735 "unmap": true, 00:31:18.735 "flush": false, 00:31:18.735 "reset": true, 00:31:18.735 "nvme_admin": false, 00:31:18.735 "nvme_io": false, 00:31:18.735 "nvme_io_md": false, 00:31:18.735 "write_zeroes": true, 00:31:18.735 "zcopy": false, 00:31:18.735 "get_zone_info": false, 00:31:18.735 
"zone_management": false, 00:31:18.735 "zone_append": false, 00:31:18.735 "compare": false, 00:31:18.735 "compare_and_write": false, 00:31:18.735 "abort": false, 00:31:18.735 "seek_hole": true, 00:31:18.735 "seek_data": true, 00:31:18.735 "copy": false, 00:31:18.735 "nvme_iov_md": false 00:31:18.735 }, 00:31:18.735 "driver_specific": { 00:31:18.735 "lvol": { 00:31:18.735 "lvol_store_uuid": "79294159-f4f0-4bd2-b0b4-5acaf37a4460", 00:31:18.735 "base_bdev": "Nvme0n1", 00:31:18.735 "thin_provision": true, 00:31:18.735 "num_allocated_clusters": 0, 00:31:18.735 "snapshot": false, 00:31:18.735 "clone": false, 00:31:18.735 "esnap_clone": false 00:31:18.735 } 00:31:18.735 } 00:31:18.735 } 00:31:18.735 ] 00:31:18.735 22:38:28 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:18.735 22:38:28 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:18.735 22:38:28 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:18.994 [2024-07-12 22:38:29.098796] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:18.994 COMP_lvs0/lv0 00:31:18.994 22:38:29 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:18.994 22:38:29 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:18.994 22:38:29 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:18.994 22:38:29 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:18.994 22:38:29 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:18.994 22:38:29 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:18.994 22:38:29 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:19.254 22:38:29 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:19.254 [ 00:31:19.254 { 00:31:19.254 "name": "COMP_lvs0/lv0", 00:31:19.254 "aliases": [ 00:31:19.254 "b77dce68-d789-520d-ad90-99431a3492b1" 00:31:19.254 ], 00:31:19.254 "product_name": "compress", 00:31:19.254 "block_size": 512, 00:31:19.254 "num_blocks": 200704, 00:31:19.254 "uuid": "b77dce68-d789-520d-ad90-99431a3492b1", 00:31:19.254 "assigned_rate_limits": { 00:31:19.254 "rw_ios_per_sec": 0, 00:31:19.254 "rw_mbytes_per_sec": 0, 00:31:19.254 "r_mbytes_per_sec": 0, 00:31:19.254 "w_mbytes_per_sec": 0 00:31:19.254 }, 00:31:19.254 "claimed": false, 00:31:19.254 "zoned": false, 00:31:19.254 "supported_io_types": { 00:31:19.254 "read": true, 00:31:19.254 "write": true, 00:31:19.254 "unmap": false, 00:31:19.254 "flush": false, 00:31:19.254 "reset": false, 00:31:19.254 "nvme_admin": false, 00:31:19.254 "nvme_io": false, 00:31:19.254 "nvme_io_md": false, 00:31:19.254 "write_zeroes": true, 00:31:19.254 "zcopy": false, 00:31:19.254 "get_zone_info": false, 00:31:19.254 "zone_management": false, 00:31:19.254 "zone_append": false, 00:31:19.254 "compare": false, 00:31:19.254 "compare_and_write": false, 00:31:19.254 "abort": false, 00:31:19.254 "seek_hole": false, 00:31:19.254 "seek_data": false, 00:31:19.254 "copy": false, 00:31:19.254 "nvme_iov_md": false 00:31:19.254 }, 00:31:19.254 "driver_specific": { 00:31:19.254 "compress": { 00:31:19.254 "name": "COMP_lvs0/lv0", 00:31:19.254 "base_bdev_name": 
"0c9673d7-6bdd-4147-a127-58dc0d2a5263" 00:31:19.254 } 00:31:19.254 } 00:31:19.254 } 00:31:19.254 ] 00:31:19.254 22:38:29 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:19.254 22:38:29 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:19.514 Running I/O for 3 seconds... 00:31:22.805 00:31:22.805 Latency(us) 00:31:22.805 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:22.805 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:22.805 Verification LBA range: start 0x0 length 0x3100 00:31:22.805 COMP_lvs0/lv0 : 3.01 2897.64 11.32 0.00 0.00 10997.22 648.24 10827.69 00:31:22.805 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:22.805 Verification LBA range: start 0x3100 length 0x3100 00:31:22.805 COMP_lvs0/lv0 : 3.01 2891.72 11.30 0.00 0.00 11023.31 1068.52 11112.63 00:31:22.805 =================================================================================================================== 00:31:22.805 Total : 5789.36 22.61 0.00 0.00 11010.25 648.24 11112.63 00:31:22.805 0 00:31:22.805 22:38:32 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:22.805 22:38:32 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:22.805 22:38:32 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:23.063 22:38:33 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:23.063 22:38:33 compress_isal -- compress/compress.sh@78 -- # killprocess 3594795 00:31:23.063 22:38:33 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 3594795 ']' 00:31:23.063 22:38:33 compress_isal -- common/autotest_common.sh@952 -- # kill -0 3594795 00:31:23.063 22:38:33 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:23.063 22:38:33 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:23.063 22:38:33 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3594795 00:31:23.063 22:38:33 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:23.063 22:38:33 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:23.063 22:38:33 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3594795' 00:31:23.063 killing process with pid 3594795 00:31:23.063 22:38:33 compress_isal -- common/autotest_common.sh@967 -- # kill 3594795 00:31:23.063 Received shutdown signal, test time was about 3.000000 seconds 00:31:23.063 00:31:23.063 Latency(us) 00:31:23.063 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:23.063 =================================================================================================================== 00:31:23.063 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:23.063 22:38:33 compress_isal -- common/autotest_common.sh@972 -- # wait 3594795 00:31:26.354 22:38:36 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:31:26.354 22:38:36 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:26.354 22:38:36 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=3596436 00:31:26.354 22:38:36 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess 
$bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:26.354 22:38:36 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:26.354 22:38:36 compress_isal -- compress/compress.sh@73 -- # waitforlisten 3596436 00:31:26.354 22:38:36 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 3596436 ']' 00:31:26.354 22:38:36 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:26.354 22:38:36 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:26.354 22:38:36 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:26.354 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:26.354 22:38:36 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:26.354 22:38:36 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:26.354 [2024-07-12 22:38:36.263708] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:31:26.354 [2024-07-12 22:38:36.263766] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3596436 ] 00:31:26.354 [2024-07-12 22:38:36.366186] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:26.354 [2024-07-12 22:38:36.473747] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:26.354 [2024-07-12 22:38:36.473753] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:26.922 22:38:37 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:26.922 22:38:37 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:26.922 22:38:37 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:31:26.922 22:38:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:26.922 22:38:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:27.489 22:38:37 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:27.489 22:38:37 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:27.489 22:38:37 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:27.489 22:38:37 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:27.489 22:38:37 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:27.489 22:38:37 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:27.489 22:38:37 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:27.749 22:38:38 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:28.009 [ 00:31:28.009 { 00:31:28.009 "name": "Nvme0n1", 00:31:28.009 "aliases": [ 00:31:28.009 "01000000-0000-0000-5cd2-e43197705251" 00:31:28.009 ], 00:31:28.009 "product_name": "NVMe disk", 00:31:28.009 "block_size": 512, 00:31:28.009 "num_blocks": 15002931888, 00:31:28.009 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:28.009 
"assigned_rate_limits": { 00:31:28.009 "rw_ios_per_sec": 0, 00:31:28.009 "rw_mbytes_per_sec": 0, 00:31:28.009 "r_mbytes_per_sec": 0, 00:31:28.009 "w_mbytes_per_sec": 0 00:31:28.009 }, 00:31:28.009 "claimed": false, 00:31:28.009 "zoned": false, 00:31:28.009 "supported_io_types": { 00:31:28.009 "read": true, 00:31:28.009 "write": true, 00:31:28.009 "unmap": true, 00:31:28.009 "flush": true, 00:31:28.009 "reset": true, 00:31:28.009 "nvme_admin": true, 00:31:28.009 "nvme_io": true, 00:31:28.009 "nvme_io_md": false, 00:31:28.009 "write_zeroes": true, 00:31:28.009 "zcopy": false, 00:31:28.009 "get_zone_info": false, 00:31:28.009 "zone_management": false, 00:31:28.009 "zone_append": false, 00:31:28.009 "compare": false, 00:31:28.009 "compare_and_write": false, 00:31:28.009 "abort": true, 00:31:28.009 "seek_hole": false, 00:31:28.009 "seek_data": false, 00:31:28.009 "copy": false, 00:31:28.009 "nvme_iov_md": false 00:31:28.009 }, 00:31:28.009 "driver_specific": { 00:31:28.009 "nvme": [ 00:31:28.009 { 00:31:28.009 "pci_address": "0000:5e:00.0", 00:31:28.009 "trid": { 00:31:28.009 "trtype": "PCIe", 00:31:28.009 "traddr": "0000:5e:00.0" 00:31:28.009 }, 00:31:28.009 "ctrlr_data": { 00:31:28.009 "cntlid": 0, 00:31:28.009 "vendor_id": "0x8086", 00:31:28.009 "model_number": "INTEL SSDPF2KX076TZO", 00:31:28.009 "serial_number": "PHAC0301002G7P6CGN", 00:31:28.009 "firmware_revision": "JCV10200", 00:31:28.009 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:28.009 "oacs": { 00:31:28.009 "security": 1, 00:31:28.009 "format": 1, 00:31:28.009 "firmware": 1, 00:31:28.009 "ns_manage": 1 00:31:28.009 }, 00:31:28.009 "multi_ctrlr": false, 00:31:28.009 "ana_reporting": false 00:31:28.009 }, 00:31:28.009 "vs": { 00:31:28.009 "nvme_version": "1.3" 00:31:28.009 }, 00:31:28.009 "ns_data": { 00:31:28.009 "id": 1, 00:31:28.009 "can_share": false 00:31:28.009 }, 00:31:28.009 "security": { 00:31:28.009 "opal": true 00:31:28.009 } 00:31:28.009 } 00:31:28.009 ], 00:31:28.009 "mp_policy": "active_passive" 00:31:28.009 } 00:31:28.009 } 00:31:28.009 ] 00:31:28.009 22:38:38 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:28.009 22:38:38 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:30.601 8bc8db30-7322-4774-bca0-c3605c330e94 00:31:30.601 22:38:40 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:30.859 d15e7cd9-677b-4c10-86db-92cf7870a7e9 00:31:30.859 22:38:40 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:30.859 22:38:40 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:30.859 22:38:40 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:30.859 22:38:40 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:30.859 22:38:40 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:30.859 22:38:40 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:30.859 22:38:40 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:31.133 22:38:41 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:31.133 [ 00:31:31.133 { 00:31:31.133 "name": 
"d15e7cd9-677b-4c10-86db-92cf7870a7e9", 00:31:31.133 "aliases": [ 00:31:31.133 "lvs0/lv0" 00:31:31.133 ], 00:31:31.133 "product_name": "Logical Volume", 00:31:31.133 "block_size": 512, 00:31:31.133 "num_blocks": 204800, 00:31:31.133 "uuid": "d15e7cd9-677b-4c10-86db-92cf7870a7e9", 00:31:31.133 "assigned_rate_limits": { 00:31:31.133 "rw_ios_per_sec": 0, 00:31:31.133 "rw_mbytes_per_sec": 0, 00:31:31.133 "r_mbytes_per_sec": 0, 00:31:31.133 "w_mbytes_per_sec": 0 00:31:31.133 }, 00:31:31.133 "claimed": false, 00:31:31.133 "zoned": false, 00:31:31.133 "supported_io_types": { 00:31:31.133 "read": true, 00:31:31.133 "write": true, 00:31:31.133 "unmap": true, 00:31:31.133 "flush": false, 00:31:31.133 "reset": true, 00:31:31.133 "nvme_admin": false, 00:31:31.133 "nvme_io": false, 00:31:31.133 "nvme_io_md": false, 00:31:31.133 "write_zeroes": true, 00:31:31.133 "zcopy": false, 00:31:31.133 "get_zone_info": false, 00:31:31.133 "zone_management": false, 00:31:31.133 "zone_append": false, 00:31:31.133 "compare": false, 00:31:31.133 "compare_and_write": false, 00:31:31.133 "abort": false, 00:31:31.133 "seek_hole": true, 00:31:31.133 "seek_data": true, 00:31:31.133 "copy": false, 00:31:31.133 "nvme_iov_md": false 00:31:31.133 }, 00:31:31.133 "driver_specific": { 00:31:31.133 "lvol": { 00:31:31.133 "lvol_store_uuid": "8bc8db30-7322-4774-bca0-c3605c330e94", 00:31:31.133 "base_bdev": "Nvme0n1", 00:31:31.133 "thin_provision": true, 00:31:31.133 "num_allocated_clusters": 0, 00:31:31.133 "snapshot": false, 00:31:31.133 "clone": false, 00:31:31.133 "esnap_clone": false 00:31:31.133 } 00:31:31.133 } 00:31:31.133 } 00:31:31.133 ] 00:31:31.392 22:38:41 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:31.392 22:38:41 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:31:31.392 22:38:41 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:31:31.392 [2024-07-12 22:38:41.693008] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:31.392 COMP_lvs0/lv0 00:31:31.392 22:38:41 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:31.392 22:38:41 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:31.392 22:38:41 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:31.393 22:38:41 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:31.393 22:38:41 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:31.393 22:38:41 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:31.393 22:38:41 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:31.651 22:38:41 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:31.910 [ 00:31:31.910 { 00:31:31.910 "name": "COMP_lvs0/lv0", 00:31:31.910 "aliases": [ 00:31:31.910 "17f86bb5-911a-5feb-86a3-126f4aaa1482" 00:31:31.910 ], 00:31:31.910 "product_name": "compress", 00:31:31.910 "block_size": 512, 00:31:31.910 "num_blocks": 200704, 00:31:31.910 "uuid": "17f86bb5-911a-5feb-86a3-126f4aaa1482", 00:31:31.910 "assigned_rate_limits": { 00:31:31.910 "rw_ios_per_sec": 0, 00:31:31.910 "rw_mbytes_per_sec": 0, 00:31:31.910 "r_mbytes_per_sec": 0, 00:31:31.910 
"w_mbytes_per_sec": 0 00:31:31.910 }, 00:31:31.910 "claimed": false, 00:31:31.910 "zoned": false, 00:31:31.910 "supported_io_types": { 00:31:31.910 "read": true, 00:31:31.910 "write": true, 00:31:31.910 "unmap": false, 00:31:31.910 "flush": false, 00:31:31.910 "reset": false, 00:31:31.910 "nvme_admin": false, 00:31:31.910 "nvme_io": false, 00:31:31.910 "nvme_io_md": false, 00:31:31.910 "write_zeroes": true, 00:31:31.910 "zcopy": false, 00:31:31.910 "get_zone_info": false, 00:31:31.910 "zone_management": false, 00:31:31.910 "zone_append": false, 00:31:31.910 "compare": false, 00:31:31.910 "compare_and_write": false, 00:31:31.910 "abort": false, 00:31:31.910 "seek_hole": false, 00:31:31.910 "seek_data": false, 00:31:31.910 "copy": false, 00:31:31.910 "nvme_iov_md": false 00:31:31.910 }, 00:31:31.910 "driver_specific": { 00:31:31.910 "compress": { 00:31:31.910 "name": "COMP_lvs0/lv0", 00:31:31.910 "base_bdev_name": "d15e7cd9-677b-4c10-86db-92cf7870a7e9" 00:31:31.910 } 00:31:31.910 } 00:31:31.910 } 00:31:31.910 ] 00:31:31.910 22:38:42 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:31.910 22:38:42 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:32.168 Running I/O for 3 seconds... 00:31:35.458 00:31:35.458 Latency(us) 00:31:35.458 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:35.458 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:35.458 Verification LBA range: start 0x0 length 0x3100 00:31:35.458 COMP_lvs0/lv0 : 3.01 3886.11 15.18 0.00 0.00 8176.71 690.98 7180.47 00:31:35.458 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:35.458 Verification LBA range: start 0x3100 length 0x3100 00:31:35.458 COMP_lvs0/lv0 : 3.00 3890.04 15.20 0.00 0.00 8182.38 544.95 7066.49 00:31:35.458 =================================================================================================================== 00:31:35.458 Total : 7776.15 30.38 0.00 0.00 8179.54 544.95 7180.47 00:31:35.458 0 00:31:35.458 22:38:45 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:35.458 22:38:45 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:35.458 22:38:45 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:35.717 22:38:45 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:35.717 22:38:45 compress_isal -- compress/compress.sh@78 -- # killprocess 3596436 00:31:35.717 22:38:45 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 3596436 ']' 00:31:35.717 22:38:45 compress_isal -- common/autotest_common.sh@952 -- # kill -0 3596436 00:31:35.717 22:38:45 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:35.717 22:38:45 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:35.717 22:38:45 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3596436 00:31:35.717 22:38:45 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:35.717 22:38:45 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:35.717 22:38:45 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3596436' 00:31:35.717 killing process with pid 3596436 00:31:35.717 
22:38:45 compress_isal -- common/autotest_common.sh@967 -- # kill 3596436 00:31:35.717 Received shutdown signal, test time was about 3.000000 seconds 00:31:35.717 00:31:35.717 Latency(us) 00:31:35.717 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:35.717 =================================================================================================================== 00:31:35.717 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:35.717 22:38:45 compress_isal -- common/autotest_common.sh@972 -- # wait 3596436 00:31:39.004 22:38:48 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:31:39.004 22:38:48 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:39.004 22:38:48 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=3598114 00:31:39.004 22:38:48 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:39.004 22:38:48 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:39.004 22:38:48 compress_isal -- compress/compress.sh@73 -- # waitforlisten 3598114 00:31:39.004 22:38:48 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 3598114 ']' 00:31:39.004 22:38:48 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:39.004 22:38:48 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:39.004 22:38:48 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:39.004 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:39.004 22:38:48 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:39.005 22:38:48 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:39.005 [2024-07-12 22:38:48.938847] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:31:39.005 [2024-07-12 22:38:48.938916] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3598114 ] 00:31:39.005 [2024-07-12 22:38:49.058075] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:39.005 [2024-07-12 22:38:49.164584] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:39.005 [2024-07-12 22:38:49.164590] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:39.572 22:38:49 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:39.572 22:38:49 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:39.572 22:38:49 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:31:39.572 22:38:49 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:39.572 22:38:49 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:40.139 22:38:50 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:40.140 22:38:50 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:40.140 22:38:50 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:40.140 22:38:50 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:40.140 22:38:50 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:40.140 22:38:50 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:40.140 22:38:50 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:40.399 22:38:50 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:40.657 [ 00:31:40.657 { 00:31:40.657 "name": "Nvme0n1", 00:31:40.657 "aliases": [ 00:31:40.657 "01000000-0000-0000-5cd2-e43197705251" 00:31:40.657 ], 00:31:40.657 "product_name": "NVMe disk", 00:31:40.657 "block_size": 512, 00:31:40.657 "num_blocks": 15002931888, 00:31:40.657 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:40.657 "assigned_rate_limits": { 00:31:40.657 "rw_ios_per_sec": 0, 00:31:40.657 "rw_mbytes_per_sec": 0, 00:31:40.657 "r_mbytes_per_sec": 0, 00:31:40.657 "w_mbytes_per_sec": 0 00:31:40.657 }, 00:31:40.657 "claimed": false, 00:31:40.657 "zoned": false, 00:31:40.657 "supported_io_types": { 00:31:40.657 "read": true, 00:31:40.657 "write": true, 00:31:40.657 "unmap": true, 00:31:40.657 "flush": true, 00:31:40.657 "reset": true, 00:31:40.657 "nvme_admin": true, 00:31:40.657 "nvme_io": true, 00:31:40.657 "nvme_io_md": false, 00:31:40.657 "write_zeroes": true, 00:31:40.657 "zcopy": false, 00:31:40.657 "get_zone_info": false, 00:31:40.657 "zone_management": false, 00:31:40.657 "zone_append": false, 00:31:40.657 "compare": false, 00:31:40.657 "compare_and_write": false, 00:31:40.657 "abort": true, 00:31:40.657 "seek_hole": false, 00:31:40.657 "seek_data": false, 00:31:40.657 "copy": false, 00:31:40.657 "nvme_iov_md": false 00:31:40.657 }, 00:31:40.657 "driver_specific": { 00:31:40.657 "nvme": [ 00:31:40.657 { 00:31:40.657 "pci_address": "0000:5e:00.0", 00:31:40.657 "trid": { 00:31:40.658 "trtype": "PCIe", 00:31:40.658 "traddr": "0000:5e:00.0" 00:31:40.658 }, 00:31:40.658 
"ctrlr_data": { 00:31:40.658 "cntlid": 0, 00:31:40.658 "vendor_id": "0x8086", 00:31:40.658 "model_number": "INTEL SSDPF2KX076TZO", 00:31:40.658 "serial_number": "PHAC0301002G7P6CGN", 00:31:40.658 "firmware_revision": "JCV10200", 00:31:40.658 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:40.658 "oacs": { 00:31:40.658 "security": 1, 00:31:40.658 "format": 1, 00:31:40.658 "firmware": 1, 00:31:40.658 "ns_manage": 1 00:31:40.658 }, 00:31:40.658 "multi_ctrlr": false, 00:31:40.658 "ana_reporting": false 00:31:40.658 }, 00:31:40.658 "vs": { 00:31:40.658 "nvme_version": "1.3" 00:31:40.658 }, 00:31:40.658 "ns_data": { 00:31:40.658 "id": 1, 00:31:40.658 "can_share": false 00:31:40.658 }, 00:31:40.658 "security": { 00:31:40.658 "opal": true 00:31:40.658 } 00:31:40.658 } 00:31:40.658 ], 00:31:40.658 "mp_policy": "active_passive" 00:31:40.658 } 00:31:40.658 } 00:31:40.658 ] 00:31:40.658 22:38:50 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:40.658 22:38:50 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:43.193 47d5321e-8042-4002-ae6a-1670d60f8d6e 00:31:43.193 22:38:53 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:43.193 057fb3c6-3acb-4f2d-ad80-2aee5399b1a5 00:31:43.193 22:38:53 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:43.193 22:38:53 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:43.193 22:38:53 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:43.193 22:38:53 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:43.193 22:38:53 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:43.193 22:38:53 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:43.193 22:38:53 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:43.452 22:38:53 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:43.711 [ 00:31:43.711 { 00:31:43.711 "name": "057fb3c6-3acb-4f2d-ad80-2aee5399b1a5", 00:31:43.711 "aliases": [ 00:31:43.711 "lvs0/lv0" 00:31:43.711 ], 00:31:43.711 "product_name": "Logical Volume", 00:31:43.711 "block_size": 512, 00:31:43.711 "num_blocks": 204800, 00:31:43.711 "uuid": "057fb3c6-3acb-4f2d-ad80-2aee5399b1a5", 00:31:43.711 "assigned_rate_limits": { 00:31:43.711 "rw_ios_per_sec": 0, 00:31:43.711 "rw_mbytes_per_sec": 0, 00:31:43.711 "r_mbytes_per_sec": 0, 00:31:43.711 "w_mbytes_per_sec": 0 00:31:43.711 }, 00:31:43.711 "claimed": false, 00:31:43.711 "zoned": false, 00:31:43.711 "supported_io_types": { 00:31:43.711 "read": true, 00:31:43.711 "write": true, 00:31:43.711 "unmap": true, 00:31:43.711 "flush": false, 00:31:43.711 "reset": true, 00:31:43.711 "nvme_admin": false, 00:31:43.711 "nvme_io": false, 00:31:43.711 "nvme_io_md": false, 00:31:43.711 "write_zeroes": true, 00:31:43.711 "zcopy": false, 00:31:43.711 "get_zone_info": false, 00:31:43.711 "zone_management": false, 00:31:43.711 "zone_append": false, 00:31:43.711 "compare": false, 00:31:43.711 "compare_and_write": false, 00:31:43.711 "abort": false, 00:31:43.711 "seek_hole": true, 00:31:43.711 "seek_data": true, 00:31:43.711 "copy": false, 00:31:43.711 
"nvme_iov_md": false 00:31:43.711 }, 00:31:43.711 "driver_specific": { 00:31:43.711 "lvol": { 00:31:43.711 "lvol_store_uuid": "47d5321e-8042-4002-ae6a-1670d60f8d6e", 00:31:43.711 "base_bdev": "Nvme0n1", 00:31:43.711 "thin_provision": true, 00:31:43.711 "num_allocated_clusters": 0, 00:31:43.711 "snapshot": false, 00:31:43.711 "clone": false, 00:31:43.711 "esnap_clone": false 00:31:43.711 } 00:31:43.711 } 00:31:43.711 } 00:31:43.711 ] 00:31:43.711 22:38:53 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:43.711 22:38:53 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:31:43.711 22:38:53 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:31:43.970 [2024-07-12 22:38:54.223211] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:43.970 COMP_lvs0/lv0 00:31:43.970 22:38:54 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:43.970 22:38:54 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:43.970 22:38:54 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:43.970 22:38:54 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:43.970 22:38:54 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:43.970 22:38:54 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:43.970 22:38:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:44.229 22:38:54 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:44.488 [ 00:31:44.488 { 00:31:44.488 "name": "COMP_lvs0/lv0", 00:31:44.488 "aliases": [ 00:31:44.488 "6f3e9e38-3970-55f2-8a11-d3312bc38c68" 00:31:44.488 ], 00:31:44.488 "product_name": "compress", 00:31:44.488 "block_size": 4096, 00:31:44.488 "num_blocks": 25088, 00:31:44.488 "uuid": "6f3e9e38-3970-55f2-8a11-d3312bc38c68", 00:31:44.488 "assigned_rate_limits": { 00:31:44.488 "rw_ios_per_sec": 0, 00:31:44.488 "rw_mbytes_per_sec": 0, 00:31:44.488 "r_mbytes_per_sec": 0, 00:31:44.488 "w_mbytes_per_sec": 0 00:31:44.488 }, 00:31:44.488 "claimed": false, 00:31:44.488 "zoned": false, 00:31:44.488 "supported_io_types": { 00:31:44.488 "read": true, 00:31:44.488 "write": true, 00:31:44.488 "unmap": false, 00:31:44.488 "flush": false, 00:31:44.488 "reset": false, 00:31:44.488 "nvme_admin": false, 00:31:44.488 "nvme_io": false, 00:31:44.488 "nvme_io_md": false, 00:31:44.488 "write_zeroes": true, 00:31:44.488 "zcopy": false, 00:31:44.488 "get_zone_info": false, 00:31:44.488 "zone_management": false, 00:31:44.488 "zone_append": false, 00:31:44.488 "compare": false, 00:31:44.488 "compare_and_write": false, 00:31:44.488 "abort": false, 00:31:44.488 "seek_hole": false, 00:31:44.488 "seek_data": false, 00:31:44.488 "copy": false, 00:31:44.488 "nvme_iov_md": false 00:31:44.488 }, 00:31:44.488 "driver_specific": { 00:31:44.488 "compress": { 00:31:44.489 "name": "COMP_lvs0/lv0", 00:31:44.489 "base_bdev_name": "057fb3c6-3acb-4f2d-ad80-2aee5399b1a5" 00:31:44.489 } 00:31:44.489 } 00:31:44.489 } 00:31:44.489 ] 00:31:44.489 22:38:54 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:44.489 22:38:54 compress_isal -- compress/compress.sh@75 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:44.489 Running I/O for 3 seconds... 00:31:47.777 00:31:47.777 Latency(us) 00:31:47.777 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:47.777 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:47.777 Verification LBA range: start 0x0 length 0x3100 00:31:47.777 COMP_lvs0/lv0 : 3.00 3925.04 15.33 0.00 0.00 8098.65 644.67 7978.30 00:31:47.777 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:47.777 Verification LBA range: start 0x3100 length 0x3100 00:31:47.777 COMP_lvs0/lv0 : 3.00 3930.25 15.35 0.00 0.00 8101.07 495.08 8092.27 00:31:47.777 =================================================================================================================== 00:31:47.777 Total : 7855.29 30.68 0.00 0.00 8099.86 495.08 8092.27 00:31:47.777 0 00:31:47.777 22:38:57 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:47.777 22:38:57 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:47.777 22:38:58 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:48.036 22:38:58 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:48.036 22:38:58 compress_isal -- compress/compress.sh@78 -- # killprocess 3598114 00:31:48.036 22:38:58 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 3598114 ']' 00:31:48.036 22:38:58 compress_isal -- common/autotest_common.sh@952 -- # kill -0 3598114 00:31:48.036 22:38:58 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:48.036 22:38:58 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:48.036 22:38:58 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3598114 00:31:48.036 22:38:58 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:48.036 22:38:58 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:48.036 22:38:58 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3598114' 00:31:48.036 killing process with pid 3598114 00:31:48.036 22:38:58 compress_isal -- common/autotest_common.sh@967 -- # kill 3598114 00:31:48.036 Received shutdown signal, test time was about 3.000000 seconds 00:31:48.036 00:31:48.036 Latency(us) 00:31:48.036 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:48.036 =================================================================================================================== 00:31:48.036 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:48.295 22:38:58 compress_isal -- common/autotest_common.sh@972 -- # wait 3598114 00:31:51.581 22:39:01 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:31:51.581 22:39:01 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:51.581 22:39:01 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=3599832 00:31:51.581 22:39:01 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:31:51.581 22:39:01 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:51.581 22:39:01 compress_isal -- compress/compress.sh@57 -- # waitforlisten 3599832 
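The bdevio pass launched here repeats the same volume setup with a different I/O generator; roughly the following, again with SPDK_DIR as shorthand for the workspace checkout, and with the compress chunk size left at its default because compress.sh calls create_vols with no size argument this time.

# start the bdevio target in wait-for-tests mode, then wait for its RPC socket
$SPDK_DIR/test/bdev/bdevio/bdevio -w &
# create_vols: same lvstore and thin lvol, but bdev_compress_create without -l
$SPDK_DIR/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
$SPDK_DIR/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100
$SPDK_DIR/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem
# run the bdevio unit tests against COMP_lvs0/lv0, then tear the stack down
$SPDK_DIR/test/bdev/bdevio/tests.py perform_tests
$SPDK_DIR/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
$SPDK_DIR/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0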
00:31:51.581 22:39:01 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 3599832 ']' 00:31:51.581 22:39:01 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:51.581 22:39:01 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:51.581 22:39:01 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:51.581 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:51.581 22:39:01 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:51.581 22:39:01 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:51.581 [2024-07-12 22:39:01.402112] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:31:51.581 [2024-07-12 22:39:01.402184] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3599832 ] 00:31:51.581 [2024-07-12 22:39:01.532339] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:51.581 [2024-07-12 22:39:01.644822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:51.581 [2024-07-12 22:39:01.644908] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:51.581 [2024-07-12 22:39:01.644912] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:52.149 22:39:02 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:52.149 22:39:02 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:52.149 22:39:02 compress_isal -- compress/compress.sh@58 -- # create_vols 00:31:52.149 22:39:02 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:52.149 22:39:02 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:52.731 22:39:02 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:52.731 22:39:02 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:52.731 22:39:02 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:52.731 22:39:02 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:52.731 22:39:02 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:52.731 22:39:02 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:52.731 22:39:02 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:52.989 22:39:03 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:53.250 [ 00:31:53.250 { 00:31:53.250 "name": "Nvme0n1", 00:31:53.250 "aliases": [ 00:31:53.250 "01000000-0000-0000-5cd2-e43197705251" 00:31:53.250 ], 00:31:53.250 "product_name": "NVMe disk", 00:31:53.250 "block_size": 512, 00:31:53.250 "num_blocks": 15002931888, 00:31:53.250 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:53.250 "assigned_rate_limits": { 00:31:53.250 "rw_ios_per_sec": 0, 00:31:53.250 "rw_mbytes_per_sec": 0, 00:31:53.250 "r_mbytes_per_sec": 0, 00:31:53.250 "w_mbytes_per_sec": 0 00:31:53.250 }, 00:31:53.250 "claimed": false, 00:31:53.250 
"zoned": false, 00:31:53.250 "supported_io_types": { 00:31:53.250 "read": true, 00:31:53.250 "write": true, 00:31:53.250 "unmap": true, 00:31:53.250 "flush": true, 00:31:53.250 "reset": true, 00:31:53.250 "nvme_admin": true, 00:31:53.250 "nvme_io": true, 00:31:53.250 "nvme_io_md": false, 00:31:53.250 "write_zeroes": true, 00:31:53.250 "zcopy": false, 00:31:53.250 "get_zone_info": false, 00:31:53.250 "zone_management": false, 00:31:53.250 "zone_append": false, 00:31:53.250 "compare": false, 00:31:53.250 "compare_and_write": false, 00:31:53.250 "abort": true, 00:31:53.250 "seek_hole": false, 00:31:53.250 "seek_data": false, 00:31:53.250 "copy": false, 00:31:53.250 "nvme_iov_md": false 00:31:53.250 }, 00:31:53.250 "driver_specific": { 00:31:53.250 "nvme": [ 00:31:53.250 { 00:31:53.250 "pci_address": "0000:5e:00.0", 00:31:53.250 "trid": { 00:31:53.250 "trtype": "PCIe", 00:31:53.250 "traddr": "0000:5e:00.0" 00:31:53.250 }, 00:31:53.250 "ctrlr_data": { 00:31:53.250 "cntlid": 0, 00:31:53.250 "vendor_id": "0x8086", 00:31:53.250 "model_number": "INTEL SSDPF2KX076TZO", 00:31:53.250 "serial_number": "PHAC0301002G7P6CGN", 00:31:53.250 "firmware_revision": "JCV10200", 00:31:53.250 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:53.250 "oacs": { 00:31:53.250 "security": 1, 00:31:53.250 "format": 1, 00:31:53.250 "firmware": 1, 00:31:53.250 "ns_manage": 1 00:31:53.250 }, 00:31:53.250 "multi_ctrlr": false, 00:31:53.250 "ana_reporting": false 00:31:53.250 }, 00:31:53.250 "vs": { 00:31:53.250 "nvme_version": "1.3" 00:31:53.250 }, 00:31:53.250 "ns_data": { 00:31:53.250 "id": 1, 00:31:53.250 "can_share": false 00:31:53.250 }, 00:31:53.250 "security": { 00:31:53.250 "opal": true 00:31:53.250 } 00:31:53.250 } 00:31:53.250 ], 00:31:53.250 "mp_policy": "active_passive" 00:31:53.250 } 00:31:53.250 } 00:31:53.250 ] 00:31:53.250 22:39:03 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:53.250 22:39:03 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:55.873 7f54fa8c-9a37-49aa-80f5-03abf1e07c2b 00:31:55.873 22:39:05 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:55.873 fe77b8cf-5a03-4cb8-a1af-59f1e3eba1cb 00:31:55.873 22:39:06 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:55.873 22:39:06 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:55.873 22:39:06 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:55.873 22:39:06 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:55.873 22:39:06 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:55.873 22:39:06 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:55.873 22:39:06 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:56.132 22:39:06 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:56.391 [ 00:31:56.391 { 00:31:56.391 "name": "fe77b8cf-5a03-4cb8-a1af-59f1e3eba1cb", 00:31:56.391 "aliases": [ 00:31:56.391 "lvs0/lv0" 00:31:56.391 ], 00:31:56.391 "product_name": "Logical Volume", 00:31:56.391 "block_size": 512, 00:31:56.391 "num_blocks": 204800, 00:31:56.391 "uuid": 
"fe77b8cf-5a03-4cb8-a1af-59f1e3eba1cb", 00:31:56.391 "assigned_rate_limits": { 00:31:56.391 "rw_ios_per_sec": 0, 00:31:56.391 "rw_mbytes_per_sec": 0, 00:31:56.391 "r_mbytes_per_sec": 0, 00:31:56.391 "w_mbytes_per_sec": 0 00:31:56.391 }, 00:31:56.391 "claimed": false, 00:31:56.391 "zoned": false, 00:31:56.391 "supported_io_types": { 00:31:56.391 "read": true, 00:31:56.391 "write": true, 00:31:56.391 "unmap": true, 00:31:56.391 "flush": false, 00:31:56.391 "reset": true, 00:31:56.391 "nvme_admin": false, 00:31:56.391 "nvme_io": false, 00:31:56.391 "nvme_io_md": false, 00:31:56.391 "write_zeroes": true, 00:31:56.391 "zcopy": false, 00:31:56.391 "get_zone_info": false, 00:31:56.391 "zone_management": false, 00:31:56.391 "zone_append": false, 00:31:56.391 "compare": false, 00:31:56.391 "compare_and_write": false, 00:31:56.391 "abort": false, 00:31:56.392 "seek_hole": true, 00:31:56.392 "seek_data": true, 00:31:56.392 "copy": false, 00:31:56.392 "nvme_iov_md": false 00:31:56.392 }, 00:31:56.392 "driver_specific": { 00:31:56.392 "lvol": { 00:31:56.392 "lvol_store_uuid": "7f54fa8c-9a37-49aa-80f5-03abf1e07c2b", 00:31:56.392 "base_bdev": "Nvme0n1", 00:31:56.392 "thin_provision": true, 00:31:56.392 "num_allocated_clusters": 0, 00:31:56.392 "snapshot": false, 00:31:56.392 "clone": false, 00:31:56.392 "esnap_clone": false 00:31:56.392 } 00:31:56.392 } 00:31:56.392 } 00:31:56.392 ] 00:31:56.392 22:39:06 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:56.392 22:39:06 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:56.392 22:39:06 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:56.651 [2024-07-12 22:39:06.856400] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:56.651 COMP_lvs0/lv0 00:31:56.651 22:39:06 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:56.651 22:39:06 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:56.651 22:39:06 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:56.651 22:39:06 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:56.651 22:39:06 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:56.651 22:39:06 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:56.651 22:39:06 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:56.910 22:39:07 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:57.169 [ 00:31:57.169 { 00:31:57.169 "name": "COMP_lvs0/lv0", 00:31:57.169 "aliases": [ 00:31:57.169 "479a0117-c5e4-5f1c-8d1d-da53db5c1868" 00:31:57.169 ], 00:31:57.169 "product_name": "compress", 00:31:57.169 "block_size": 512, 00:31:57.169 "num_blocks": 200704, 00:31:57.169 "uuid": "479a0117-c5e4-5f1c-8d1d-da53db5c1868", 00:31:57.169 "assigned_rate_limits": { 00:31:57.169 "rw_ios_per_sec": 0, 00:31:57.169 "rw_mbytes_per_sec": 0, 00:31:57.169 "r_mbytes_per_sec": 0, 00:31:57.169 "w_mbytes_per_sec": 0 00:31:57.169 }, 00:31:57.169 "claimed": false, 00:31:57.169 "zoned": false, 00:31:57.169 "supported_io_types": { 00:31:57.169 "read": true, 00:31:57.169 "write": true, 00:31:57.169 "unmap": false, 00:31:57.169 "flush": false, 00:31:57.169 
"reset": false, 00:31:57.169 "nvme_admin": false, 00:31:57.169 "nvme_io": false, 00:31:57.169 "nvme_io_md": false, 00:31:57.169 "write_zeroes": true, 00:31:57.169 "zcopy": false, 00:31:57.169 "get_zone_info": false, 00:31:57.169 "zone_management": false, 00:31:57.169 "zone_append": false, 00:31:57.169 "compare": false, 00:31:57.169 "compare_and_write": false, 00:31:57.169 "abort": false, 00:31:57.169 "seek_hole": false, 00:31:57.169 "seek_data": false, 00:31:57.169 "copy": false, 00:31:57.169 "nvme_iov_md": false 00:31:57.169 }, 00:31:57.169 "driver_specific": { 00:31:57.169 "compress": { 00:31:57.169 "name": "COMP_lvs0/lv0", 00:31:57.169 "base_bdev_name": "fe77b8cf-5a03-4cb8-a1af-59f1e3eba1cb" 00:31:57.169 } 00:31:57.169 } 00:31:57.169 } 00:31:57.169 ] 00:31:57.169 22:39:07 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:57.169 22:39:07 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:57.169 I/O targets: 00:31:57.169 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:57.169 00:31:57.169 00:31:57.169 CUnit - A unit testing framework for C - Version 2.1-3 00:31:57.170 http://cunit.sourceforge.net/ 00:31:57.170 00:31:57.170 00:31:57.170 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:57.170 Test: blockdev write read block ...passed 00:31:57.170 Test: blockdev write zeroes read block ...passed 00:31:57.170 Test: blockdev write zeroes read no split ...passed 00:31:57.170 Test: blockdev write zeroes read split ...passed 00:31:57.170 Test: blockdev write zeroes read split partial ...passed 00:31:57.170 Test: blockdev reset ...[2024-07-12 22:39:07.479353] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:57.170 passed 00:31:57.170 Test: blockdev write read 8 blocks ...passed 00:31:57.170 Test: blockdev write read size > 128k ...passed 00:31:57.170 Test: blockdev write read invalid size ...passed 00:31:57.170 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:57.170 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:57.170 Test: blockdev write read max offset ...passed 00:31:57.170 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:57.170 Test: blockdev writev readv 8 blocks ...passed 00:31:57.170 Test: blockdev writev readv 30 x 1block ...passed 00:31:57.170 Test: blockdev writev readv block ...passed 00:31:57.170 Test: blockdev writev readv size > 128k ...passed 00:31:57.170 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:57.170 Test: blockdev comparev and writev ...passed 00:31:57.170 Test: blockdev nvme passthru rw ...passed 00:31:57.170 Test: blockdev nvme passthru vendor specific ...passed 00:31:57.170 Test: blockdev nvme admin passthru ...passed 00:31:57.170 Test: blockdev copy ...passed 00:31:57.170 00:31:57.170 Run Summary: Type Total Ran Passed Failed Inactive 00:31:57.170 suites 1 1 n/a 0 0 00:31:57.170 tests 23 23 23 0 0 00:31:57.170 asserts 130 130 130 0 n/a 00:31:57.170 00:31:57.170 Elapsed time = 0.108 seconds 00:31:57.170 0 00:31:57.429 22:39:07 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:31:57.429 22:39:07 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:57.689 22:39:07 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
00:31:57.689 22:39:07 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:57.689 22:39:07 compress_isal -- compress/compress.sh@62 -- # killprocess 3599832 00:31:57.689 22:39:07 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 3599832 ']' 00:31:57.689 22:39:07 compress_isal -- common/autotest_common.sh@952 -- # kill -0 3599832 00:31:57.689 22:39:07 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:57.689 22:39:07 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:57.689 22:39:07 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3599832 00:31:57.689 22:39:08 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:57.689 22:39:08 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:57.689 22:39:08 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3599832' 00:31:57.689 killing process with pid 3599832 00:31:57.689 22:39:08 compress_isal -- common/autotest_common.sh@967 -- # kill 3599832 00:31:57.689 22:39:08 compress_isal -- common/autotest_common.sh@972 -- # wait 3599832 00:32:00.981 22:39:10 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:32:00.981 22:39:10 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:32:00.981 00:32:00.981 real 0m47.535s 00:32:00.981 user 1m51.305s 00:32:00.981 sys 0m4.154s 00:32:00.981 22:39:10 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:00.981 22:39:10 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:00.981 ************************************ 00:32:00.981 END TEST compress_isal 00:32:00.981 ************************************ 00:32:00.981 22:39:11 -- common/autotest_common.sh@1142 -- # return 0 00:32:00.981 22:39:11 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:32:00.981 22:39:11 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:32:00.981 22:39:11 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:32:00.981 22:39:11 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:00.981 22:39:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:00.981 22:39:11 -- common/autotest_common.sh@10 -- # set +x 00:32:00.981 ************************************ 00:32:00.981 START TEST blockdev_crypto_aesni 00:32:00.981 ************************************ 00:32:00.981 22:39:11 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:32:00.981 * Looking for test storage... 
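The blockdev_crypto_aesni suite that starts here is driven entirely by blockdev.sh; run outside the harness it amounts to the invocation below, a sketch assuming the same checkout and an environment already prepared by autotest (hugepages configured, typically run as root).

cd /var/jenkins/workspace/crypto-phy-autotest/spdk
test/bdev/blockdev.sh crypto_aesni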
00:32:00.981 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=3601561 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:32:00.981 22:39:11 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 3601561 00:32:00.981 22:39:11 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 3601561 ']' 00:32:00.981 22:39:11 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:00.981 22:39:11 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:00.981 22:39:11 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX 
domain socket /var/tmp/spdk.sock...' 00:32:00.981 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:00.981 22:39:11 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:00.981 22:39:11 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:00.981 [2024-07-12 22:39:11.276011] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:32:00.981 [2024-07-12 22:39:11.276087] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3601561 ] 00:32:01.241 [2024-07-12 22:39:11.404397] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:01.241 [2024-07-12 22:39:11.508159] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:02.179 22:39:12 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:02.179 22:39:12 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:32:02.179 22:39:12 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:32:02.180 22:39:12 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:32:02.180 22:39:12 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:32:02.180 22:39:12 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:02.180 22:39:12 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:02.180 [2024-07-12 22:39:12.206368] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:02.180 [2024-07-12 22:39:12.214402] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:02.180 [2024-07-12 22:39:12.222416] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:02.180 [2024-07-12 22:39:12.294275] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:04.717 true 00:32:04.717 true 00:32:04.717 true 00:32:04.717 true 00:32:04.717 Malloc0 00:32:04.717 Malloc1 00:32:04.717 Malloc2 00:32:04.717 Malloc3 00:32:04.717 [2024-07-12 22:39:14.686215] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:04.717 crypto_ram 00:32:04.717 [2024-07-12 22:39:14.694233] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:04.717 crypto_ram2 00:32:04.717 [2024-07-12 22:39:14.702254] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:04.717 crypto_ram3 00:32:04.717 [2024-07-12 22:39:14.710274] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:04.717 crypto_ram4 00:32:04.717 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:04.717 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:32:04.717 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:04.717 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:04.717 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:04.717 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:32:04.717 22:39:14 blockdev_crypto_aesni 
-- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:32:04.717 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:04.717 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:04.717 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:04.717 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:32:04.717 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:04.717 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:04.717 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:04.717 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:32:04.717 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:04.718 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:32:04.718 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:32:04.718 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:04.718 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:32:04.718 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:32:04.718 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "da2083d8-576c-5c7f-8cdc-629db3e3d758"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "da2083d8-576c-5c7f-8cdc-629db3e3d758",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "98c7760f-ce2e-5499-b939-7314efcee72b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "98c7760f-ce2e-5499-b939-7314efcee72b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "34d7d990-3224-5150-9de9-9ce48a6a2a1c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "34d7d990-3224-5150-9de9-9ce48a6a2a1c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "ef28ce93-2293-545d-9119-e876b42f45e4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ef28ce93-2293-545d-9119-e876b42f45e4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:04.718 22:39:14 
blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:32:04.718 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:32:04.718 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:32:04.718 22:39:14 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 3601561 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 3601561 ']' 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 3601561 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3601561 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3601561' 00:32:04.718 killing process with pid 3601561 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 3601561 00:32:04.718 22:39:14 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 3601561 00:32:05.288 22:39:15 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:05.288 22:39:15 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:05.288 22:39:15 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:32:05.288 22:39:15 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:05.288 22:39:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:05.288 ************************************ 00:32:05.288 START TEST bdev_hello_world 00:32:05.288 ************************************ 00:32:05.288 22:39:15 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:05.547 [2024-07-12 22:39:15.657854] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:32:05.547 [2024-07-12 22:39:15.657914] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3602102 ] 00:32:05.547 [2024-07-12 22:39:15.785117] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:05.806 [2024-07-12 22:39:15.887621] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:05.806 [2024-07-12 22:39:15.908900] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:05.806 [2024-07-12 22:39:15.916935] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:05.806 [2024-07-12 22:39:15.924959] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:05.806 [2024-07-12 22:39:16.030421] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:08.341 [2024-07-12 22:39:18.257695] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:08.341 [2024-07-12 22:39:18.257768] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:08.341 [2024-07-12 22:39:18.257783] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:08.341 [2024-07-12 22:39:18.265715] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:08.341 [2024-07-12 22:39:18.265738] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:08.341 [2024-07-12 22:39:18.265751] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:08.341 [2024-07-12 22:39:18.273734] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:08.341 [2024-07-12 22:39:18.273752] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:08.341 [2024-07-12 22:39:18.273764] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:08.341 [2024-07-12 22:39:18.281754] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:08.341 [2024-07-12 22:39:18.281771] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:08.341 [2024-07-12 22:39:18.281782] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:08.341 [2024-07-12 22:39:18.354370] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:08.341 [2024-07-12 22:39:18.354414] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:08.341 [2024-07-12 22:39:18.354432] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:32:08.341 [2024-07-12 22:39:18.355691] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:08.341 [2024-07-12 22:39:18.355764] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:08.341 [2024-07-12 22:39:18.355781] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:08.341 [2024-07-12 22:39:18.355825] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
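The hello-world sub-test above boils down to a single invocation against the pre-generated bdev.json config, with the crypto vbdev selected by name; the trailing '' in the run_test line is just the empty extra-arguments slot. A sketch, again using SPDK_DIR as workspace shorthand:

# opens crypto_ram, writes "Hello World!", reads it back and prints it (the NOTICE lines above)
$SPDK_DIR/build/examples/hello_bdev --json $SPDK_DIR/test/bdev/bdev.json -b crypto_ram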
00:32:08.341 00:32:08.341 [2024-07-12 22:39:18.355843] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:08.600 00:32:08.600 real 0m3.131s 00:32:08.600 user 0m2.723s 00:32:08.600 sys 0m0.372s 00:32:08.600 22:39:18 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:08.600 22:39:18 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:08.600 ************************************ 00:32:08.600 END TEST bdev_hello_world 00:32:08.600 ************************************ 00:32:08.600 22:39:18 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:08.600 22:39:18 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:32:08.600 22:39:18 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:08.600 22:39:18 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:08.600 22:39:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:08.600 ************************************ 00:32:08.600 START TEST bdev_bounds 00:32:08.600 ************************************ 00:32:08.600 22:39:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:32:08.600 22:39:18 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=3602476 00:32:08.600 22:39:18 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:32:08.601 22:39:18 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 3602476' 00:32:08.601 Process bdevio pid: 3602476 00:32:08.601 22:39:18 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 3602476 00:32:08.601 22:39:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 3602476 ']' 00:32:08.601 22:39:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:08.601 22:39:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:08.601 22:39:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:08.601 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:08.601 22:39:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:08.601 22:39:18 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:08.601 22:39:18 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:08.601 [2024-07-12 22:39:18.864102] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:32:08.601 [2024-07-12 22:39:18.864167] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3602476 ] 00:32:08.860 [2024-07-12 22:39:18.991211] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:08.860 [2024-07-12 22:39:19.096578] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:08.860 [2024-07-12 22:39:19.096660] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:08.860 [2024-07-12 22:39:19.096665] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:08.860 [2024-07-12 22:39:19.117990] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:08.860 [2024-07-12 22:39:19.126012] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:08.860 [2024-07-12 22:39:19.134031] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:09.119 [2024-07-12 22:39:19.231756] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:11.652 [2024-07-12 22:39:21.443667] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:11.652 [2024-07-12 22:39:21.443760] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:11.652 [2024-07-12 22:39:21.443775] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:11.652 [2024-07-12 22:39:21.451686] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:11.652 [2024-07-12 22:39:21.451707] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:11.652 [2024-07-12 22:39:21.451719] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:11.652 [2024-07-12 22:39:21.459708] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:11.652 [2024-07-12 22:39:21.459731] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:11.652 [2024-07-12 22:39:21.459744] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:11.652 [2024-07-12 22:39:21.467730] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:11.652 [2024-07-12 22:39:21.467747] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:11.652 [2024-07-12 22:39:21.467764] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:11.652 22:39:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:11.652 22:39:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:32:11.652 22:39:21 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:11.652 I/O targets: 00:32:11.652 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:32:11.652 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:32:11.652 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:32:11.652 crypto_ram4: 8192 blocks of 4096 bytes 
(32 MiB) 00:32:11.652 00:32:11.652 00:32:11.652 CUnit - A unit testing framework for C - Version 2.1-3 00:32:11.652 http://cunit.sourceforge.net/ 00:32:11.652 00:32:11.652 00:32:11.652 Suite: bdevio tests on: crypto_ram4 00:32:11.652 Test: blockdev write read block ...passed 00:32:11.652 Test: blockdev write zeroes read block ...passed 00:32:11.652 Test: blockdev write zeroes read no split ...passed 00:32:11.652 Test: blockdev write zeroes read split ...passed 00:32:11.652 Test: blockdev write zeroes read split partial ...passed 00:32:11.652 Test: blockdev reset ...passed 00:32:11.652 Test: blockdev write read 8 blocks ...passed 00:32:11.652 Test: blockdev write read size > 128k ...passed 00:32:11.652 Test: blockdev write read invalid size ...passed 00:32:11.652 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:11.652 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:11.652 Test: blockdev write read max offset ...passed 00:32:11.652 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:11.652 Test: blockdev writev readv 8 blocks ...passed 00:32:11.652 Test: blockdev writev readv 30 x 1block ...passed 00:32:11.652 Test: blockdev writev readv block ...passed 00:32:11.652 Test: blockdev writev readv size > 128k ...passed 00:32:11.652 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:11.652 Test: blockdev comparev and writev ...passed 00:32:11.652 Test: blockdev nvme passthru rw ...passed 00:32:11.652 Test: blockdev nvme passthru vendor specific ...passed 00:32:11.652 Test: blockdev nvme admin passthru ...passed 00:32:11.652 Test: blockdev copy ...passed 00:32:11.652 Suite: bdevio tests on: crypto_ram3 00:32:11.652 Test: blockdev write read block ...passed 00:32:11.652 Test: blockdev write zeroes read block ...passed 00:32:11.652 Test: blockdev write zeroes read no split ...passed 00:32:11.652 Test: blockdev write zeroes read split ...passed 00:32:11.652 Test: blockdev write zeroes read split partial ...passed 00:32:11.652 Test: blockdev reset ...passed 00:32:11.652 Test: blockdev write read 8 blocks ...passed 00:32:11.652 Test: blockdev write read size > 128k ...passed 00:32:11.652 Test: blockdev write read invalid size ...passed 00:32:11.652 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:11.652 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:11.652 Test: blockdev write read max offset ...passed 00:32:11.652 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:11.652 Test: blockdev writev readv 8 blocks ...passed 00:32:11.652 Test: blockdev writev readv 30 x 1block ...passed 00:32:11.652 Test: blockdev writev readv block ...passed 00:32:11.652 Test: blockdev writev readv size > 128k ...passed 00:32:11.652 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:11.652 Test: blockdev comparev and writev ...passed 00:32:11.652 Test: blockdev nvme passthru rw ...passed 00:32:11.652 Test: blockdev nvme passthru vendor specific ...passed 00:32:11.652 Test: blockdev nvme admin passthru ...passed 00:32:11.652 Test: blockdev copy ...passed 00:32:11.652 Suite: bdevio tests on: crypto_ram2 00:32:11.652 Test: blockdev write read block ...passed 00:32:11.652 Test: blockdev write zeroes read block ...passed 00:32:11.652 Test: blockdev write zeroes read no split ...passed 00:32:11.652 Test: blockdev write zeroes read split ...passed 00:32:11.652 Test: blockdev write zeroes read split partial ...passed 
00:32:11.652 Test: blockdev reset ...passed 00:32:11.652 Test: blockdev write read 8 blocks ...passed 00:32:11.652 Test: blockdev write read size > 128k ...passed 00:32:11.652 Test: blockdev write read invalid size ...passed 00:32:11.652 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:11.652 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:11.652 Test: blockdev write read max offset ...passed 00:32:11.652 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:11.652 Test: blockdev writev readv 8 blocks ...passed 00:32:11.652 Test: blockdev writev readv 30 x 1block ...passed 00:32:11.652 Test: blockdev writev readv block ...passed 00:32:11.652 Test: blockdev writev readv size > 128k ...passed 00:32:11.652 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:11.652 Test: blockdev comparev and writev ...passed 00:32:11.652 Test: blockdev nvme passthru rw ...passed 00:32:11.652 Test: blockdev nvme passthru vendor specific ...passed 00:32:11.652 Test: blockdev nvme admin passthru ...passed 00:32:11.652 Test: blockdev copy ...passed 00:32:11.652 Suite: bdevio tests on: crypto_ram 00:32:11.652 Test: blockdev write read block ...passed 00:32:11.652 Test: blockdev write zeroes read block ...passed 00:32:11.652 Test: blockdev write zeroes read no split ...passed 00:32:11.652 Test: blockdev write zeroes read split ...passed 00:32:11.652 Test: blockdev write zeroes read split partial ...passed 00:32:11.652 Test: blockdev reset ...passed 00:32:11.652 Test: blockdev write read 8 blocks ...passed 00:32:11.652 Test: blockdev write read size > 128k ...passed 00:32:11.652 Test: blockdev write read invalid size ...passed 00:32:11.652 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:11.652 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:11.652 Test: blockdev write read max offset ...passed 00:32:11.652 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:11.652 Test: blockdev writev readv 8 blocks ...passed 00:32:11.652 Test: blockdev writev readv 30 x 1block ...passed 00:32:11.652 Test: blockdev writev readv block ...passed 00:32:11.652 Test: blockdev writev readv size > 128k ...passed 00:32:11.652 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:11.652 Test: blockdev comparev and writev ...passed 00:32:11.652 Test: blockdev nvme passthru rw ...passed 00:32:11.652 Test: blockdev nvme passthru vendor specific ...passed 00:32:11.652 Test: blockdev nvme admin passthru ...passed 00:32:11.652 Test: blockdev copy ...passed 00:32:11.652 00:32:11.652 Run Summary: Type Total Ran Passed Failed Inactive 00:32:11.652 suites 4 4 n/a 0 0 00:32:11.652 tests 92 92 92 0 0 00:32:11.652 asserts 520 520 520 0 n/a 00:32:11.652 00:32:11.652 Elapsed time = 0.538 seconds 00:32:11.652 0 00:32:11.652 22:39:21 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 3602476 00:32:11.652 22:39:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 3602476 ']' 00:32:11.652 22:39:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 3602476 00:32:11.652 22:39:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:32:11.652 22:39:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:11.652 22:39:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o 
comm= 3602476 00:32:11.911 22:39:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:11.911 22:39:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:11.911 22:39:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3602476' 00:32:11.911 killing process with pid 3602476 00:32:11.911 22:39:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 3602476 00:32:11.911 22:39:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 3602476 00:32:12.170 22:39:22 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:32:12.170 00:32:12.170 real 0m3.645s 00:32:12.170 user 0m10.160s 00:32:12.170 sys 0m0.575s 00:32:12.170 22:39:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:12.170 22:39:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:12.170 ************************************ 00:32:12.170 END TEST bdev_bounds 00:32:12.170 ************************************ 00:32:12.170 22:39:22 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:12.170 22:39:22 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:12.170 22:39:22 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:32:12.170 22:39:22 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:12.170 22:39:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:12.429 ************************************ 00:32:12.429 START TEST bdev_nbd 00:32:12.429 ************************************ 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 
-- # bdev_num=4 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=3603023 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 3603023 /var/tmp/spdk-nbd.sock 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 3603023 ']' 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:32:12.429 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:12.429 22:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:12.429 [2024-07-12 22:39:22.609830] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:32:12.429 [2024-07-12 22:39:22.609908] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:12.429 [2024-07-12 22:39:22.741278] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:12.687 [2024-07-12 22:39:22.846556] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:12.687 [2024-07-12 22:39:22.867880] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:12.687 [2024-07-12 22:39:22.875917] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:12.687 [2024-07-12 22:39:22.883920] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:12.687 [2024-07-12 22:39:22.992059] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:15.222 [2024-07-12 22:39:25.230410] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:15.222 [2024-07-12 22:39:25.230478] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:15.222 [2024-07-12 22:39:25.230494] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:15.222 [2024-07-12 22:39:25.238432] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:15.222 [2024-07-12 22:39:25.238451] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:15.222 [2024-07-12 22:39:25.238463] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:15.222 [2024-07-12 22:39:25.246451] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:15.222 [2024-07-12 22:39:25.246468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:15.222 [2024-07-12 22:39:25.246479] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:15.222 [2024-07-12 22:39:25.254471] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:15.222 [2024-07-12 22:39:25.254488] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:15.222 [2024-07-12 22:39:25.254499] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:15.222 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:15.481 1+0 records in 00:32:15.481 1+0 records out 00:32:15.481 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297046 s, 13.8 MB/s 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:15.481 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:32:15.741 
22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:15.741 1+0 records in 00:32:15.741 1+0 records out 00:32:15.741 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000322362 s, 12.7 MB/s 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:15.741 22:39:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:16.010 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:32:16.010 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:32:16.010 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:32:16.010 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:32:16.010 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:16.010 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:16.010 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:16.010 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:32:16.010 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:16.010 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 
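Each attach traced in this loop is followed by the same readiness probe: wait for the device node to appear in /proc/partitions, then issue one 4 KiB direct-I/O read to confirm the export answers. A rough per-device equivalent (nbd0 shown; the retry bound, block size, and temp-file path come from the trace above, the short pause between attempts is assumed):

    # up to 20 attempts for the partition entry, then a single direct read
    for i in $(seq 1 20); do grep -q -w nbd0 /proc/partitions && break; sleep 0.1; done
    dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest \
        bs=4096 count=1 iflag=direct
    stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest   # expect 4096
    rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest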
00:32:16.010 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:16.011 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:16.011 1+0 records in 00:32:16.011 1+0 records out 00:32:16.011 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000379173 s, 10.8 MB/s 00:32:16.011 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:16.011 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:16.011 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:16.011 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:16.011 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:16.011 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:16.011 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:16.011 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:16.338 1+0 records in 00:32:16.338 1+0 records out 00:32:16.338 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333377 s, 12.3 MB/s 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:16.338 22:39:26 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:16.338 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:16.597 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:16.597 { 00:32:16.597 "nbd_device": "/dev/nbd0", 00:32:16.597 "bdev_name": "crypto_ram" 00:32:16.597 }, 00:32:16.597 { 00:32:16.597 "nbd_device": "/dev/nbd1", 00:32:16.597 "bdev_name": "crypto_ram2" 00:32:16.597 }, 00:32:16.597 { 00:32:16.597 "nbd_device": "/dev/nbd2", 00:32:16.597 "bdev_name": "crypto_ram3" 00:32:16.597 }, 00:32:16.597 { 00:32:16.597 "nbd_device": "/dev/nbd3", 00:32:16.597 "bdev_name": "crypto_ram4" 00:32:16.597 } 00:32:16.597 ]' 00:32:16.597 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:16.597 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:16.597 { 00:32:16.597 "nbd_device": "/dev/nbd0", 00:32:16.597 "bdev_name": "crypto_ram" 00:32:16.597 }, 00:32:16.597 { 00:32:16.597 "nbd_device": "/dev/nbd1", 00:32:16.597 "bdev_name": "crypto_ram2" 00:32:16.597 }, 00:32:16.597 { 00:32:16.597 "nbd_device": "/dev/nbd2", 00:32:16.597 "bdev_name": "crypto_ram3" 00:32:16.597 }, 00:32:16.597 { 00:32:16.597 "nbd_device": "/dev/nbd3", 00:32:16.597 "bdev_name": "crypto_ram4" 00:32:16.597 } 00:32:16.597 ]' 00:32:16.597 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:16.598 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:32:16.598 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:16.598 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:32:16.598 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:16.598 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:16.598 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:16.598 22:39:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:16.856 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:16.856 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:16.856 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:16.856 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:16.856 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:16.856 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:16.856 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:16.856 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 
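The tear-down just logged for /dev/nbd0 (and repeated for the remaining devices below) follows one pattern: ask the RPC server on the test socket to detach the export, then poll /proc/partitions until the kernel drops the node. A rough per-device equivalent, with the socket and paths taken from the trace (the sleep interval is assumed):

    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
        -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
    # give the kernel up to ~20 short polls to remove the partition entry
    for i in $(seq 1 20); do grep -q -w nbd0 /proc/partitions || break; sleep 0.1; done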
00:32:16.856 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:16.856 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:17.118 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:17.118 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:17.118 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:17.118 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:17.118 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:17.118 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:17.118 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:17.118 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:17.118 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:17.118 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:32:17.377 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:32:17.377 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:32:17.377 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:32:17.377 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:17.377 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:17.377 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:32:17.377 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:17.377 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:17.377 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:17.377 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:32:17.636 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:32:17.636 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:32:17.636 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:32:17.636 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:17.636 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:17.636 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:32:17.636 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:17.636 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:17.636 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:17.636 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
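The count check entered here confirms that nothing is left exported: list the NBD disks over the same RPC socket and count /dev/nbd entries, expecting zero at this point (the same check repeats later in the run and expects 4 once all four bdevs are re-attached). Roughly:

    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
        -s /var/tmp/spdk-nbd.sock nbd_get_disks \
        | jq -r '.[] | .nbd_device' | grep -c /dev/nbd   # 0 here, 4 after re-attach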
00:32:17.636 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:17.896 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:17.896 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:17.896 22:39:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:17.896 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:32:18.155 /dev/nbd0 00:32:18.155 22:39:28 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:18.155 1+0 records in 00:32:18.155 1+0 records out 00:32:18.155 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000298108 s, 13.7 MB/s 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:18.155 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:32:18.155 /dev/nbd1 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:18.414 1+0 records in 00:32:18.414 1+0 records out 00:32:18.414 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280389 s, 14.6 MB/s 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:18.414 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:32:18.673 /dev/nbd10 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:18.673 1+0 records in 00:32:18.673 1+0 records out 00:32:18.673 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000345536 s, 11.9 MB/s 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # 
(( i < 4 )) 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:32:18.673 /dev/nbd11 00:32:18.673 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:32:18.932 22:39:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:32:18.932 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:32:18.932 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:18.932 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:18.932 22:39:28 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:18.932 1+0 records in 00:32:18.932 1+0 records out 00:32:18.932 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000349416 s, 11.7 MB/s 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:18.932 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:19.192 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:19.192 { 00:32:19.192 "nbd_device": "/dev/nbd0", 00:32:19.192 "bdev_name": "crypto_ram" 00:32:19.192 }, 00:32:19.192 { 00:32:19.192 "nbd_device": "/dev/nbd1", 00:32:19.192 "bdev_name": "crypto_ram2" 00:32:19.192 }, 00:32:19.192 { 00:32:19.192 "nbd_device": "/dev/nbd10", 00:32:19.192 "bdev_name": "crypto_ram3" 00:32:19.192 }, 00:32:19.192 { 00:32:19.192 "nbd_device": "/dev/nbd11", 00:32:19.192 "bdev_name": "crypto_ram4" 00:32:19.193 } 00:32:19.193 ]' 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 
00:32:19.193 { 00:32:19.193 "nbd_device": "/dev/nbd0", 00:32:19.193 "bdev_name": "crypto_ram" 00:32:19.193 }, 00:32:19.193 { 00:32:19.193 "nbd_device": "/dev/nbd1", 00:32:19.193 "bdev_name": "crypto_ram2" 00:32:19.193 }, 00:32:19.193 { 00:32:19.193 "nbd_device": "/dev/nbd10", 00:32:19.193 "bdev_name": "crypto_ram3" 00:32:19.193 }, 00:32:19.193 { 00:32:19.193 "nbd_device": "/dev/nbd11", 00:32:19.193 "bdev_name": "crypto_ram4" 00:32:19.193 } 00:32:19.193 ]' 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:19.193 /dev/nbd1 00:32:19.193 /dev/nbd10 00:32:19.193 /dev/nbd11' 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:32:19.193 /dev/nbd1 00:32:19.193 /dev/nbd10 00:32:19.193 /dev/nbd11' 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:19.193 256+0 records in 00:32:19.193 256+0 records out 00:32:19.193 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011498 s, 91.2 MB/s 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:19.193 256+0 records in 00:32:19.193 256+0 records out 00:32:19.193 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0621337 s, 16.9 MB/s 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:19.193 256+0 records in 00:32:19.193 256+0 records out 00:32:19.193 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0656272 s, 16.0 MB/s 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:19.193 22:39:29 blockdev_crypto_aesni.bdev_nbd 
-- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:32:19.452 256+0 records in 00:32:19.452 256+0 records out 00:32:19.452 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0435418 s, 24.1 MB/s 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:32:19.452 256+0 records in 00:32:19.452 256+0 records out 00:32:19.452 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0574833 s, 18.2 MB/s 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:19.452 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:19.453 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:19.453 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:32:19.453 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:19.453 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:32:19.453 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:19.453 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:19.453 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:19.453 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:19.453 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:19.453 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:19.453 
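The nbd_dd_data_verify steps traced above reduce to a write-then-compare loop over the exported NBD devices. A minimal standalone sketch of that pattern, assuming the same four devices; the temp-file path is a stand-in for the nbdrandtest file used by the suite:

# write phase: 1 MiB (256 x 4 KiB) of random data, copied onto every NBD export
nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11)
tmp_file=/tmp/nbdrandtest

dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
for dev in "${nbd_list[@]}"; do
    dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
done

# verify phase: byte-compare the first 1 MiB of every device against the file
for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp_file" "$dev"    # a non-zero exit here fails the test
done
rm "$tmp_file"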
22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:19.453 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:19.712 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:19.712 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:19.712 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:19.712 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:19.712 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:19.712 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:19.712 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:19.712 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:19.712 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:19.712 22:39:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:19.712 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:19.712 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:19.712 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:19.712 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:19.712 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:19.712 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:19.970 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:19.970 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:19.970 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:19.970 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:32:20.228 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:32:20.228 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:32:20.228 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:32:20.228 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:20.228 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:20.228 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:32:20.228 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:20.228 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:20.228 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:20.228 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:20.486 22:39:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:20.745 malloc_lvol_verify 00:32:20.745 22:39:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:21.003 35e9ac8e-bb86-4bf6-ae5d-98c2eff8cc3f 00:32:21.003 22:39:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:21.262 a0b15f16-44b9-4f44-b4af-6ac0e9600bf8 00:32:21.262 22:39:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:21.521 /dev/nbd0 00:32:21.521 22:39:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:32:21.521 mke2fs 1.46.5 (30-Dec-2021) 00:32:21.521 Discarding device blocks: 0/4096 done 00:32:21.521 Creating filesystem with 4096 1k blocks and 1024 inodes 00:32:21.521 00:32:21.521 Allocating group tables: 0/1 done 00:32:21.521 Writing inode tables: 0/1 done 00:32:21.521 Creating journal (1024 blocks): done 00:32:21.521 Writing superblocks and filesystem accounting information: 0/1 done 00:32:21.521 00:32:21.521 22:39:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:21.521 22:39:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:32:21.521 22:39:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:21.521 22:39:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:21.521 22:39:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:21.521 22:39:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:21.521 22:39:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:21.521 22:39:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:21.779 22:39:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:21.779 22:39:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:21.779 22:39:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:21.779 22:39:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:21.779 22:39:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:21.779 22:39:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 3603023 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 3603023 ']' 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 3603023 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- 
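The nbd_with_lvol_verify step above builds a small logical volume on a fresh malloc bdev, exports it over NBD, and treats a successful mkfs.ext4 as the pass condition. The same RPC sequence collected into one sketch; the arguments are copied from the trace, while the polling interval in the wait loop is an assumption:

RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-nbd.sock

"$RPC" -s "$SOCK" bdev_malloc_create -b malloc_lvol_verify 16 512   # base bdev, args as traced
"$RPC" -s "$SOCK" bdev_lvol_create_lvstore malloc_lvol_verify lvs   # lvstore on the malloc bdev
"$RPC" -s "$SOCK" bdev_lvol_create lvol 4 -l lvs                    # small lvol inside that store
"$RPC" -s "$SOCK" nbd_start_disk lvs/lvol /dev/nbd0                 # export it as /dev/nbd0

mkfs.ext4 /dev/nbd0                                                 # must exit 0 for the test to pass

"$RPC" -s "$SOCK" nbd_stop_disk /dev/nbd0
# waitfornbd_exit-style poll: up to 20 attempts until nbd0 leaves /proc/partitions
for i in $(seq 1 20); do
    grep -q -w nbd0 /proc/partitions || break
    sleep 0.1                                                       # assumed interval
done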
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3603023 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3603023' 00:32:22.038 killing process with pid 3603023 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 3603023 00:32:22.038 22:39:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 3603023 00:32:22.298 22:39:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:32:22.298 00:32:22.298 real 0m10.045s 00:32:22.298 user 0m12.849s 00:32:22.298 sys 0m4.031s 00:32:22.298 22:39:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:22.298 22:39:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:22.298 ************************************ 00:32:22.298 END TEST bdev_nbd 00:32:22.298 ************************************ 00:32:22.558 22:39:32 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:22.558 22:39:32 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:32:22.558 22:39:32 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:32:22.558 22:39:32 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:32:22.558 22:39:32 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:32:22.558 22:39:32 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:22.558 22:39:32 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:22.558 22:39:32 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:22.558 ************************************ 00:32:22.558 START TEST bdev_fio 00:32:22.558 ************************************ 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:22.558 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio 
-- common/autotest_common.sh@1281 -- # local workload=verify 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:22.558 ************************************ 00:32:22.558 START TEST bdev_fio_rw_verify 00:32:22.558 ************************************ 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:22.558 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:22.559 22:39:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:22.816 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:22.816 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:22.816 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:22.816 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:22.816 fio-3.35 00:32:22.816 Starting 4 threads 00:32:37.699 00:32:37.699 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=3604959: Fri Jul 12 22:39:45 2024 00:32:37.699 read: IOPS=21.2k, BW=82.7MiB/s (86.8MB/s)(827MiB/10001msec) 00:32:37.699 slat (usec): min=10, max=463, avg=63.76, stdev=40.76 00:32:37.699 clat (usec): min=10, max=1427, avg=336.86, stdev=253.16 00:32:37.699 lat (usec): min=55, max=1582, avg=400.62, stdev=281.23 00:32:37.699 clat percentiles (usec): 00:32:37.699 | 50.000th=[ 265], 99.000th=[ 1188], 99.900th=[ 1336], 99.990th=[ 1385], 00:32:37.699 | 99.999th=[ 1418] 00:32:37.699 write: IOPS=23.4k, BW=91.4MiB/s (95.8MB/s)(892MiB/9760msec); 0 zone resets 00:32:37.699 slat (usec): min=18, max=402, avg=76.93, stdev=41.41 00:32:37.699 clat (usec): min=22, max=1853, avg=410.67, stdev=296.11 00:32:37.699 lat (usec): min=58, max=2051, avg=487.61, stdev=324.59 00:32:37.699 clat percentiles (usec): 00:32:37.699 | 50.000th=[ 338], 99.000th=[ 1500], 99.900th=[ 1680], 99.990th=[ 1745], 00:32:37.699 | 99.999th=[ 1795] 00:32:37.699 bw ( KiB/s): min=71096, max=126912, per=97.52%, avg=91264.00, stdev=3259.80, samples=76 00:32:37.699 iops : min=17774, max=31728, avg=22816.00, stdev=814.95, samples=76 00:32:37.699 lat (usec) : 20=0.01%, 50=0.15%, 100=8.07%, 250=31.59%, 500=37.88% 00:32:37.699 lat (usec) : 750=11.48%, 1000=6.02% 00:32:37.699 lat (msec) : 2=4.80% 00:32:37.699 cpu : usr=99.61%, sys=0.00%, ctx=63, majf=0, minf=238 00:32:37.699 IO depths : 1=10.5%, 2=25.5%, 4=51.0%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:37.699 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:37.699 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:37.699 issued rwts: total=211833,228342,0,0 
short=0,0,0,0 dropped=0,0,0,0 00:32:37.699 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:37.699 00:32:37.699 Run status group 0 (all jobs): 00:32:37.699 READ: bw=82.7MiB/s (86.8MB/s), 82.7MiB/s-82.7MiB/s (86.8MB/s-86.8MB/s), io=827MiB (868MB), run=10001-10001msec 00:32:37.699 WRITE: bw=91.4MiB/s (95.8MB/s), 91.4MiB/s-91.4MiB/s (95.8MB/s-95.8MB/s), io=892MiB (935MB), run=9760-9760msec 00:32:37.699 00:32:37.699 real 0m13.463s 00:32:37.699 user 0m45.840s 00:32:37.699 sys 0m0.464s 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:32:37.699 ************************************ 00:32:37.699 END TEST bdev_fio_rw_verify 00:32:37.699 ************************************ 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:37.699 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "da2083d8-576c-5c7f-8cdc-629db3e3d758"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "da2083d8-576c-5c7f-8cdc-629db3e3d758",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "98c7760f-ce2e-5499-b939-7314efcee72b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "98c7760f-ce2e-5499-b939-7314efcee72b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "34d7d990-3224-5150-9de9-9ce48a6a2a1c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "34d7d990-3224-5150-9de9-9ce48a6a2a1c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' 
"aliases": [' ' "ef28ce93-2293-545d-9119-e876b42f45e4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ef28ce93-2293-545d-9119-e876b42f45e4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:32:37.700 crypto_ram2 00:32:37.700 crypto_ram3 00:32:37.700 crypto_ram4 ]] 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "da2083d8-576c-5c7f-8cdc-629db3e3d758"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "da2083d8-576c-5c7f-8cdc-629db3e3d758",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "98c7760f-ce2e-5499-b939-7314efcee72b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "98c7760f-ce2e-5499-b939-7314efcee72b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' 
"get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "34d7d990-3224-5150-9de9-9ce48a6a2a1c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "34d7d990-3224-5150-9de9-9ce48a6a2a1c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "ef28ce93-2293-545d-9119-e876b42f45e4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ef28ce93-2293-545d-9119-e876b42f45e4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- 
bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:37.700 ************************************ 00:32:37.700 START TEST bdev_fio_trim 00:32:37.700 ************************************ 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:37.700 22:39:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:37.700 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:37.700 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:37.700 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:37.700 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:37.700 fio-3.35 00:32:37.700 Starting 4 threads 00:32:49.915 00:32:49.915 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=3606833: Fri Jul 12 22:39:59 2024 00:32:49.915 write: IOPS=33.2k, BW=130MiB/s (136MB/s)(1297MiB/10001msec); 0 zone resets 00:32:49.915 slat (usec): min=11, max=437, avg=69.44, stdev=51.01 00:32:49.915 clat (usec): min=36, max=1773, avg=303.73, stdev=258.76 00:32:49.915 lat (usec): min=59, max=2000, 
avg=373.17, stdev=299.62 00:32:49.915 clat percentiles (usec): 00:32:49.915 | 50.000th=[ 223], 99.000th=[ 1434], 99.900th=[ 1598], 99.990th=[ 1680], 00:32:49.915 | 99.999th=[ 1762] 00:32:49.915 bw ( KiB/s): min=109256, max=156880, per=100.00%, avg=132923.37, stdev=3319.69, samples=76 00:32:49.915 iops : min=27314, max=39220, avg=33230.84, stdev=829.92, samples=76 00:32:49.915 trim: IOPS=33.2k, BW=130MiB/s (136MB/s)(1297MiB/10001msec); 0 zone resets 00:32:49.915 slat (nsec): min=4536, max=71705, avg=18988.48, stdev=9072.50 00:32:49.915 clat (usec): min=55, max=1707, avg=287.11, stdev=177.68 00:32:49.915 lat (usec): min=62, max=1719, avg=306.10, stdev=182.83 00:32:49.915 clat percentiles (usec): 00:32:49.915 | 50.000th=[ 241], 99.000th=[ 988], 99.900th=[ 1074], 99.990th=[ 1139], 00:32:49.915 | 99.999th=[ 1549] 00:32:49.915 bw ( KiB/s): min=109248, max=156912, per=100.00%, avg=132925.05, stdev=3320.04, samples=76 00:32:49.915 iops : min=27312, max=39228, avg=33231.26, stdev=830.01, samples=76 00:32:49.915 lat (usec) : 50=0.90%, 100=8.38%, 250=46.00%, 500=32.74%, 750=7.45% 00:32:49.915 lat (usec) : 1000=2.49% 00:32:49.915 lat (msec) : 2=2.06% 00:32:49.915 cpu : usr=99.54%, sys=0.00%, ctx=51, majf=0, minf=107 00:32:49.915 IO depths : 1=8.5%, 2=26.2%, 4=52.3%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:49.915 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:49.915 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:49.915 issued rwts: total=0,332136,332136,0 short=0,0,0,0 dropped=0,0,0,0 00:32:49.915 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:49.915 00:32:49.915 Run status group 0 (all jobs): 00:32:49.915 WRITE: bw=130MiB/s (136MB/s), 130MiB/s-130MiB/s (136MB/s-136MB/s), io=1297MiB (1360MB), run=10001-10001msec 00:32:49.915 TRIM: bw=130MiB/s (136MB/s), 130MiB/s-130MiB/s (136MB/s-136MB/s), io=1297MiB (1360MB), run=10001-10001msec 00:32:49.915 00:32:49.915 real 0m13.599s 00:32:49.915 user 0m45.957s 00:32:49.915 sys 0m0.527s 00:32:49.915 22:40:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:49.915 22:40:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:32:49.915 ************************************ 00:32:49.915 END TEST bdev_fio_trim 00:32:49.915 ************************************ 00:32:49.915 22:40:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:49.915 22:40:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:32:49.915 22:40:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:49.915 22:40:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:32:49.915 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:49.915 22:40:00 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:32:49.915 00:32:49.915 real 0m27.407s 00:32:49.915 user 1m31.981s 00:32:49.915 sys 0m1.178s 00:32:49.915 22:40:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:49.915 22:40:00 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:49.915 ************************************ 00:32:49.915 END TEST bdev_fio 00:32:49.915 ************************************ 00:32:49.915 22:40:00 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:49.915 22:40:00 
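Both fio passes above (bdev_fio_rw_verify and bdev_fio_trim) follow the same recipe: fio_config_gen writes a bdev.fio job file with one [job_...] section per crypto bdev, and fio is then run with the SPDK bdev fio plugin preloaded so that ioengine=spdk_bdev resolves. A condensed sketch of that recipe; the job file below is a minimal stand-in rather than a verbatim copy of what fio_config_gen emits, and the real runs also pass --spdk_mem=0 and --aux-path as seen in the trace:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

cat > "$SPDK/test/bdev/bdev.fio" <<'EOF'
; fio_config_gen also writes a [global] section (verify/trim settings), omitted here
[job_crypto_ram]
filename=crypto_ram
[job_crypto_ram2]
filename=crypto_ram2
[job_crypto_ram3]
filename=crypto_ram3
[job_crypto_ram4]
filename=crypto_ram4
EOF

# The preloaded plugin provides the spdk_bdev ioengine; --spdk_json_conf tells it
# how to construct the crypto_ram* bdevs before the jobs start.
LD_PRELOAD="$SPDK/build/fio/spdk_bdev" /usr/src/fio/fio \
    --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
    --spdk_json_conf="$SPDK/test/bdev/bdev.json" \
    --verify_state_save=0 "$SPDK/test/bdev/bdev.fio"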
blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:49.915 22:40:00 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:49.915 22:40:00 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:32:49.915 22:40:00 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:49.915 22:40:00 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:49.915 ************************************ 00:32:49.915 START TEST bdev_verify 00:32:49.915 ************************************ 00:32:49.915 22:40:00 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:49.915 [2024-07-12 22:40:00.198481] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:32:49.915 [2024-07-12 22:40:00.198532] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3608174 ] 00:32:50.175 [2024-07-12 22:40:00.310212] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:50.175 [2024-07-12 22:40:00.418145] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:50.175 [2024-07-12 22:40:00.418151] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:50.175 [2024-07-12 22:40:00.439560] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:50.175 [2024-07-12 22:40:00.447589] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:50.175 [2024-07-12 22:40:00.455616] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:50.434 [2024-07-12 22:40:00.579492] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:52.970 [2024-07-12 22:40:02.816991] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:52.970 [2024-07-12 22:40:02.817088] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:52.970 [2024-07-12 22:40:02.817105] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:52.970 [2024-07-12 22:40:02.825008] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:52.970 [2024-07-12 22:40:02.825028] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:52.970 [2024-07-12 22:40:02.825040] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:52.970 [2024-07-12 22:40:02.833032] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:52.970 [2024-07-12 22:40:02.833049] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:52.971 [2024-07-12 22:40:02.833060] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev 
creation deferred pending base bdev arrival 00:32:52.971 [2024-07-12 22:40:02.841054] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:52.971 [2024-07-12 22:40:02.841071] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:52.971 [2024-07-12 22:40:02.841082] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:52.971 Running I/O for 5 seconds... 00:32:58.303 00:32:58.303 Latency(us) 00:32:58.303 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:58.303 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:58.303 Verification LBA range: start 0x0 length 0x1000 00:32:58.303 crypto_ram : 5.06 480.53 1.88 0.00 0.00 265467.07 13335.15 152271.47 00:32:58.303 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:58.303 Verification LBA range: start 0x1000 length 0x1000 00:32:58.303 crypto_ram : 5.07 479.70 1.87 0.00 0.00 266055.87 15044.79 185096.46 00:32:58.303 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:58.303 Verification LBA range: start 0x0 length 0x1000 00:32:58.303 crypto_ram2 : 5.06 480.23 1.88 0.00 0.00 264746.35 13050.21 149536.06 00:32:58.303 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:58.303 Verification LBA range: start 0x1000 length 0x1000 00:32:58.303 crypto_ram2 : 5.07 479.61 1.87 0.00 0.00 265230.81 15044.79 169595.77 00:32:58.303 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:58.303 Verification LBA range: start 0x0 length 0x1000 00:32:58.303 crypto_ram3 : 5.05 3777.55 14.76 0.00 0.00 33562.59 7351.43 26670.30 00:32:58.303 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:58.303 Verification LBA range: start 0x1000 length 0x1000 00:32:58.303 crypto_ram3 : 5.06 3767.28 14.72 0.00 0.00 33655.04 3305.29 26556.33 00:32:58.303 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:58.303 Verification LBA range: start 0x0 length 0x1000 00:32:58.303 crypto_ram4 : 5.06 3795.15 14.82 0.00 0.00 33375.59 2080.06 25986.45 00:32:58.303 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:58.303 Verification LBA range: start 0x1000 length 0x1000 00:32:58.303 crypto_ram4 : 5.06 3766.84 14.71 0.00 0.00 33570.10 3376.53 26556.33 00:32:58.303 =================================================================================================================== 00:32:58.303 Total : 17026.90 66.51 0.00 0.00 59725.23 2080.06 185096.46 00:32:58.303 00:32:58.303 real 0m8.274s 00:32:58.303 user 0m15.660s 00:32:58.303 sys 0m0.400s 00:32:58.303 22:40:08 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:58.303 22:40:08 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:32:58.303 ************************************ 00:32:58.303 END TEST bdev_verify 00:32:58.303 ************************************ 00:32:58.303 22:40:08 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:58.303 22:40:08 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:58.303 22:40:08 
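bdev_verify above and bdev_verify_big_io below drive the same bdevperf binary and differ only in the I/O size (-o 4096 vs -o 65536). A sketch of the invocation with the options spelled out; the annotations reflect common bdevperf usage and are worth confirming against the binary's --help in this tree:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

# --json    : constructs the crypto_ram* bdevs from bdev.json at startup
# -q 128    : queue depth
# -o N      : I/O size in bytes (4096 for bdev_verify, 65536 for bdev_verify_big_io)
# -w verify : write the bdevs, read the data back, and compare
# -t 5      : run time in seconds
# -C        : let every core in the mask submit I/O to each bdev
# -m 0x3    : core mask, i.e. the two reactors started in the log above
"$SPDK/build/examples/bdevperf" --json "$SPDK/test/bdev/bdev.json" \
    -q 128 -o 65536 -w verify -t 5 -C -m 0x3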
blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:32:58.303 22:40:08 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:58.303 22:40:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:58.303 ************************************ 00:32:58.303 START TEST bdev_verify_big_io 00:32:58.303 ************************************ 00:32:58.304 22:40:08 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:58.304 [2024-07-12 22:40:08.570615] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:32:58.304 [2024-07-12 22:40:08.570685] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3609232 ] 00:32:58.562 [2024-07-12 22:40:08.701228] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:58.562 [2024-07-12 22:40:08.806851] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:58.562 [2024-07-12 22:40:08.806856] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:58.562 [2024-07-12 22:40:08.828244] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:58.562 [2024-07-12 22:40:08.836272] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:58.562 [2024-07-12 22:40:08.844301] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:58.821 [2024-07-12 22:40:08.950778] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:01.354 [2024-07-12 22:40:11.174024] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:01.354 [2024-07-12 22:40:11.174125] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:01.354 [2024-07-12 22:40:11.174140] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:01.354 [2024-07-12 22:40:11.182041] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:01.354 [2024-07-12 22:40:11.182061] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:01.354 [2024-07-12 22:40:11.182073] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:01.354 [2024-07-12 22:40:11.190062] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:01.354 [2024-07-12 22:40:11.190079] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:01.354 [2024-07-12 22:40:11.190091] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:01.354 [2024-07-12 22:40:11.198085] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:01.354 [2024-07-12 22:40:11.198102] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:01.354 [2024-07-12 22:40:11.198113] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:01.354 Running I/O for 5 seconds... 00:33:03.892 [2024-07-12 22:40:13.838870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.839355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.839784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.841128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.842780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.844101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.845644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.847170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.847919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.848324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.849748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.851046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.853730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.855264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.856936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.858504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.859391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.860283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.861575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.863128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.865537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.867087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.868634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
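The bdev_verify_big_io pass above drives the same four crypto vbdevs with 64 KiB I/Os at queue depth 128, and the long run of accel_dpdk_cryptodev "Failed to get src_mbufs!" notices that follows appears to be the accel module reporting that it could not pull source mbufs from its pool at that instant; the workload keeps running through them, so they read as transient allocation pressure rather than an immediate test failure. For reference, the two bdevperf invocations recorded in this section can be re-run by hand outside the harness; a minimal sketch, assuming a local SPDK checkout in SPDK_DIR (the CI run used /var/jenkins/workspace/crypto-phy-autotest/spdk, and the trailing empty '' argument added by run_test is dropped):

SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}

# 4 KiB verify pass (bdev_verify), flags as recorded above
"$SPDK_DIR/build/examples/bdevperf" --json "$SPDK_DIR/test/bdev/bdev.json" \
    -q 128 -o 4096 -w verify -t 5 -C -m 0x3

# 64 KiB verify pass (bdev_verify_big_io), the workload running at this point in the log
"$SPDK_DIR/build/examples/bdevperf" --json "$SPDK_DIR/test/bdev/bdev.json" \
    -q 128 -o 65536 -w verify -t 5 -C -m 0x3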
00:33:03.893 [2024-07-12 22:40:13.869270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.870180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.871684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.873269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.874814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.877594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.879164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.880056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.880454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.882503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.883947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.885522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.887187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.889947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.891333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.891719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.892108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.893733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.895293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.896843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.897508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.900199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.900678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.901072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.893 [2024-07-12 22:40:13.901657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.903579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.905138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.906385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.907495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.909732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.910133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.910531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.912100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.914048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.915614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.916195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.917486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.918963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.919362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.919410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.920561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.922518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.924081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.924136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.924559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.925694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.927293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.927339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.893 [2024-07-12 22:40:13.927722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.928234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.929537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.929585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.930924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.932127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.933640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.933690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.935239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.935636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.936821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.936882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.937277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.938653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.940210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.940262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.941852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.942398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.943703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.943751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.945293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.946537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.946945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.946998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.893 [2024-07-12 22:40:13.948474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.948878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.950459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.950508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.951446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.952960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.953016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.953401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.953444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.955315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.955372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.955973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.956030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.957574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.957631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.958028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.958079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.960046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.960101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.960677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.960724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.962307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.962362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.963250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.893 [2024-07-12 22:40:13.963298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.964584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.964640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.965893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.965946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.967840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.967905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.969552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.893 [2024-07-12 22:40:13.969609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.970360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.970419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.971839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.971886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.974070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.974130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.975320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.975369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.977223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.977289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.978720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.978767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.981786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.981852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.983344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.894 [2024-07-12 22:40:13.983392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.985162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.985221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.986550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.986598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.989315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.989373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.990428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.990472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.992270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.992327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.992832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.992876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.994675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.994734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.995130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.995175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.996062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.996121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.996513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.996561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.998685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.998744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:13.999137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.894 [2024-07-12 22:40:13.999196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.000147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.000204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.000594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.000641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.002565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.002623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.003015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.003059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.003999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.004058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.004446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.004493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.006183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.006246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.006630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.006673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.007508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.007566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.007964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.008014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.009633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.009703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.010104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.894 [2024-07-12 22:40:14.010150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.010947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.011011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.011401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.011469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.013347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.013412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.013798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.013845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.014730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.014790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.015201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.015250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.017795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.017853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.018252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.018313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.019243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.019309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.019694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.019743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.021622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.021679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.022077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.894 [2024-07-12 22:40:14.022127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.023084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.023141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.023535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.023587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.025279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.025338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.025728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.025775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.026654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.026710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.027109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.027160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.028760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.028819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.029216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.029266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.030163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.030219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.030606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.030655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.032317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.032382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.032774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.894 [2024-07-12 22:40:14.032830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.032847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.033232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.033729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.033784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.034180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.034230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.034248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.034546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.035538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.035947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.036004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.036392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.036780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.036945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.037339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.037383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.037765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.038106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.039083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.039135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.039177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.039218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.894 [2024-07-12 22:40:14.039525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.895 [2024-07-12 22:40:14.039683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.039728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.039789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.039836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.040102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.041143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.041195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.041236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.041277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.041597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.041749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.041799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.041841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.041882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.042234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.043219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.043274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.043320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.043361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.043642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.043794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.043839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.043880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.043942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.895 [2024-07-12 22:40:14.044202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.045186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.045238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.045280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.045321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.045681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.045831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.045877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.045918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.045967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.046279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.172802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.172879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.173800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.173847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.175573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.175632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.177179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.177224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.180485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.180547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.181946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.181993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.183793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.895 [2024-07-12 22:40:14.183856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.185251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.185298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.187388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.187447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.187829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.187872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.189570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.189628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.190945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.190992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.193342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.193400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.194709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.196259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.197083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.197138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.197908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.199207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.201074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.201132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.202422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.204286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.204343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:03.895 [2024-07-12 22:40:14.205372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:03.895 [2024-07-12 22:40:14.206741] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:03.895 [2024-07-12 22:40:14.208162] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:03.895 [2024-07-12 22:40:14.209485] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:03.895 [2024-07-12 22:40:14.211033] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:03.895 [2024-07-12 22:40:14.211521] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:03.895 [2024-07-12 22:40:14.212834] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:03.895 [2024-07-12 22:40:14.214175] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.215734] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.218657] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.220135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.221544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.223128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.224879] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.226211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.227759] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.228869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.231695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.232972] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.234509] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.235149] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.237106] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.238722] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.240147] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:04.157 [2024-07-12 22:40:14.240542] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.243170] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.244735] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.245555] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.247134] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.248878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.250481] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.250888] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.251289] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.253984] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.255096] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.255152] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.256680] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.258551] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.260233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.260302] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.260695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.262069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.263399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.263454] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.265009] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.265503] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.266802] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.266857] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:04.157 [2024-07-12 22:40:14.268171] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.269357] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.269777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.269829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.271335] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.271748] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.273334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.273389] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.274581] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.275751] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.277322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.277377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.277943] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.278446] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.279350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.279406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.280712] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.157 [2024-07-12 22:40:14.281896] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.283430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.283493] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.285167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.285582] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.287167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.287223] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:04.158 [2024-07-12 22:40:14.287613] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.288974] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.290285] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.290339] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.291880] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.292381] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.293705] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.293760] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.295083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.296282] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.296701] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.296768] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.298238] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.298653] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.300312] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.300377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.301464] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.302659] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.303074] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.303127] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.303515] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.303997] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.305212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.305270] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:04.158 [2024-07-12 22:40:14.305663] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.306901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.307320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.307373] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.308055] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.308478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.309741] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.309797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.311035] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.312349] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.312756] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.312819] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.314439] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.314850] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.315286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.315344] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.316545] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.317757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.318673] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.318732] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.319944] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.321491] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.321555] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.322737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:04.158 [2024-07-12 22:40:14.323361] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.325999] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.326064] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.326769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.327872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.328305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.328707] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.329120] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.329174] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.330418] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.331905] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.333012] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.333068] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.333908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.334323] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.334380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.335686] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.338204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.338839] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.338901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.339297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.340160] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.340232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.340633] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:04.158 [2024-07-12 22:40:14.341037] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.342720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.342798] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.343201] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.343598] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.344088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.344497] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.344890] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.344961] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.346323] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.346733] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.347143] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.347199] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.348036] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.348436] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.348495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.348894] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.350550] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.350974] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.351038] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.158 [2024-07-12 22:40:14.351427] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.352307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.352373] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.352771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:04.159 [2024-07-12 22:40:14.353176] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.354989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.355055] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.355447] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.355842] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.356367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.356778] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.357184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.357239] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.358711] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.359128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.359525] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.359589] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.360493] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.360894] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.360959] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.361354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.363345] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.363756] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.363813] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.364222] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.365149] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.365225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.365631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:04.159 [2024-07-12 22:40:14.366040] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.368200] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.368268] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.368660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.369075] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.369583] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:04.159 [2024-07-12 22:40:14.370052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.370111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.370497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.371944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.371973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.372358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.372401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.372783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.373150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.373651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.373706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.374112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.374501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.376213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.376269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.376659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.377062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.377471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.159 [2024-07-12 22:40:14.377625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.378029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.378074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.378462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.379711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.380123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.380175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.380565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.380974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.381129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.381522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.381566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.381962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.383255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.384801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.384857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.386384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.386781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.386944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.388222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.388272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.389592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.390858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.392056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.159 [2024-07-12 22:40:14.392107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.392996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.393269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.393425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.394665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.394715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.395279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.396720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.397287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.397336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.397377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.397689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.397842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.398894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.398951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.398993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.400157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.400562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.400610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.401001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.401268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.401426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.403022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.159 [2024-07-12 22:40:14.403074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.159 [2024-07-12 22:40:14.403953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.405339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.405391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.405433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.405476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.405973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.406127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.406179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.406221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.406261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.407518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.407570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.407611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.407651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.407972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.408125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.408171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.408212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.408256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.409681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.409733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.409774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.409815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.410090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.160 [2024-07-12 22:40:14.410240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.410287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.410329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.410370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.411543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.411596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.411637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.412112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.412163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.412219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.414587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.414668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.415209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.417208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.417290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.417878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.420676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.420758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.421449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.422594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.422674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.423955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.426038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.426120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.160 [2024-07-12 22:40:14.427350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.428787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.428865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.429984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.432785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.432865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.434217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.436348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.436431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.436823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.439585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.439676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.440616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.442411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.442490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.444026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.446697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.446777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.447384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.449080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.449161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.450684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.453167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.453247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.160 [2024-07-12 22:40:14.454771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.456355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.456432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.457634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.460119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.460199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.461720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.463751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.463843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.465293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.467328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.467408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.468897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.471043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.471130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.472758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.473990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.475267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.475313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.476609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.477017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.477076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.477979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.478027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.160 [2024-07-12 22:40:14.478067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.478333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.160 [2024-07-12 22:40:14.479325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.480114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.480165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.480206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.481945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.482003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.482045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.483573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.483934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.486101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.486159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.486212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.487728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.488204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.488255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.488666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.488711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.488986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.490182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.490240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.491896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.491951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.422 [2024-07-12 22:40:14.492358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.493771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.493819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.493860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.494136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.495097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.496654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.496704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.496745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.498200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.498255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.498298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.499792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.500238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.504557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.504621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.504664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.505977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.506394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.506448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.508014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.508060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.508361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.509373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.422 [2024-07-12 22:40:14.509427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.510732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.510777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.511240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.512562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.512612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.512653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.513011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.513849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.515403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.515458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.515499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.517346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.517408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.517452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.518779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.519052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.522524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.522581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.522622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.523974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.524386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.524444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.525362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.422 [2024-07-12 22:40:14.525409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.525743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.529322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.529375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.530036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.530082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.530484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.531365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.531412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.531454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.531768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.535261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.536665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.536713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.536761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.422 [2024-07-12 22:40:14.538695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.538757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.538800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.539196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.539662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.543984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.544045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.544087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.545397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.423 [2024-07-12 22:40:14.547358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.547421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.548785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.548830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.549271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.553355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.553412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.554032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.554078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.554483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.555830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.555879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.555923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.556193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.559037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.559452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.559499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.559548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.561529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.561604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.563158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.563203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.563469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.567806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.423 [2024-07-12 22:40:14.567865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.569332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.569375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.571085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.571139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.571937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.571982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.572286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.577281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.577338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.578658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.578703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.579915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.579976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.581029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.581076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.581350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.585906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.585968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.587019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.587063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.588768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.588824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.590312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.423 [2024-07-12 22:40:14.590367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.590632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.594109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.594166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.595440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.596753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.598110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.598163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.599719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.601379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.601651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.606539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.606594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.607625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.609296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.609351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.610895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.611109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.614181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.615266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.616856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.617348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.617756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.618304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.423 [2024-07-12 22:40:14.619593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.620980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.621251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.626602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.627904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.629434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.629829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.630589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.632244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.633796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.635263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.635534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.640202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.641361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.642035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.643458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.645272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.646569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.647880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.423 [2024-07-12 22:40:14.649234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.649575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.651753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.653313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.654199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.424 [2024-07-12 22:40:14.655291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.656659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.657592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.658882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.660214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.660490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.662619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.662675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.663987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.664033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.665259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.665315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.666204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.666250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.666522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.670998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.671056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.671920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.671972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.673900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.673967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.675580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.675626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.675895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.424 [2024-07-12 22:40:14.680416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.680474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.681839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.681885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.683845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.683900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.684686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.684732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.685049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.689291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.689346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.690940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.690986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.693011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.693076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.694663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.694729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.695057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.698647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.698705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.699449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.699496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.700789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.700846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.424 [2024-07-12 22:40:14.701751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.701801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.702076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.707164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.707223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.707768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.707814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.709804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.709872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.710791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.710836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.711204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.715336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.715394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.716789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.716835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.718151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.718210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.719765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.719826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.720109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.723433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.723493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.725142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.424 [2024-07-12 22:40:14.725187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.726512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.726569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.728084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.728130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.728469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.734092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.734150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.735383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.735431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.737510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.737573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.739226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.739279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.739587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.743676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.744227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.424 [2024-07-12 22:40:14.744276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.745369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.746608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.747898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.747953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.748341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.748617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.686 [2024-07-12 22:40:14.751565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.752542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.752594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.754071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.755960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.756017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.756738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.757824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.758221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.764094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.764150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.764535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.764935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.765474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.766789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.767548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.767595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.767901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.769712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.770409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.771547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.771591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.772370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.772879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.686 [2024-07-12 22:40:14.772937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.774090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.774417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.778600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.779056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.779107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.780313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.781135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.781200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.781642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.783038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.783409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.787753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.787819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.788300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.789651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.790174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.790584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.790988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.791038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.791307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.793099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.793507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.795150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.686 [2024-07-12 22:40:14.795197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.796985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.797380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.797437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.797824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.798189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.800906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.801310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.801360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.801753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.686 [2024-07-12 22:40:14.803957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.804021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.804607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.805849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.806183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.810795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.810855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.811254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.811648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.812157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.813823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.814276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.814324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.814594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.687 [2024-07-12 22:40:14.816365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.816767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.818255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.818301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.819145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.819545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.819597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.821039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.821440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.825866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.826277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.826326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.827636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.828474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.828539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.828938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.830410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.830825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.835260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.835328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.835837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.837147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.837634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.837694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.687 [2024-07-12 22:40:14.838121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.838177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.838537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.842102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.842172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.842568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.842616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.844143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.845151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.845202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.845847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.846167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.850204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.850605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.850654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.851055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.852923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.852986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.853865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.853910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.854253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.856446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.856502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.857624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.687 [2024-07-12 22:40:14.857669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.859466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.859527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.859922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.859990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.860261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.863316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.863375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.864561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.864610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.865898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.865962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.867156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.867205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.867595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.873578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.873655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.875193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.875248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.877043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.877101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.878409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.878452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.878724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.687 [2024-07-12 22:40:14.881829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.881886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.882718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.882778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.884310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.884367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.885164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.885211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.885511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.887690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.887745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.889040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.889084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.687 [2024-07-12 22:40:14.889543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.688 [2024-07-12 22:40:14.889591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.688 [2024-07-12 22:40:14.890798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.688 [2024-07-12 22:40:14.890847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.688 [2024-07-12 22:40:14.891207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.688 [2024-07-12 22:40:14.896427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.688 [2024-07-12 22:40:14.896485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.688 [2024-07-12 22:40:14.896534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.688 [2024-07-12 22:40:14.896588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.688 [2024-07-12 22:40:14.898686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.688 [2024-07-12 22:40:14.898748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.688 [2024-07-12 22:40:14.898791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:04.688 ... (the same *ERROR* line from accel_dpdk_cryptodev.c:468 repeats for each subsequent allocation attempt; message and source location are identical, only the timestamps advance from 22:40:14.898 to 22:40:15.342) ...
00:33:05.215 [2024-07-12 22:40:15.342808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:05.215 [2024-07-12 22:40:15.344220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.344273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.344617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.347320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.347376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.348209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.348258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.349834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.350893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.350946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.352157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.352513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.356082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.357614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.357669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.359286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.360146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.360201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.361262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.361308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.361625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.366142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.366200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.366943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.215 [2024-07-12 22:40:15.366988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.367847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.367910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.369457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.369511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.369835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.373783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.373854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.375240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.375286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.376241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.376305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.377589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.377637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.377975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.381330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.381387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.382235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.382281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.384017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.384075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.385455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.385500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.385809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.215 [2024-07-12 22:40:15.390221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.390278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.390661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.390705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.392353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.392422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.393738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.393783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.394092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.397467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.397520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.399081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.399126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.399765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.399819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.400313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.400360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.400631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.405752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.405811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.405852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.405893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.407643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.407699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.215 [2024-07-12 22:40:15.407740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.407781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.408126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.410693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.410745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.410786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.410827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.411279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.411326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.411367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.411409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.411729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.412696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.412748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.412790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.412831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.413323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.413371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.413412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.413461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.413868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.414872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.414923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.414978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.215 [2024-07-12 22:40:15.415019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.415424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.415476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.415521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.415563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.415833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.416793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.418223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.418272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.419765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.420234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.420634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.420682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.421074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.421345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.422384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.423712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.423760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.424594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.425016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.426349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.426398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.427724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.428001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.215 [2024-07-12 22:40:15.429409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.430937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.430989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.432406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.432810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.434410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.434458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.435842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.436159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.437208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.438191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.438241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.439230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.439644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.440349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.440397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.441799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.442119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.443163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.444007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.444057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.445351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.445761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.447091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.215 [2024-07-12 22:40:15.447140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.448324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.448597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.449902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.450427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.450476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.451762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.452177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.453508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.215 [2024-07-12 22:40:15.453558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.454749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.455030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.456084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.457400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.457448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.457833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.458328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.459517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.459566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.460856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.461197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.462195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.463538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.463587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.216 [2024-07-12 22:40:15.464894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.465326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.466749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.466797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.467194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.467655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.468593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.470165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.470221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.471880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.472479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.474145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.474200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.475799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.476077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.477140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.477197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.478841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.478895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.479310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.479362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.480886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.480950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.481220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.216 [2024-07-12 22:40:15.483498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.483557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.484869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.484915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.485378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.485770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.485814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.485855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.486137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.487123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.488457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.488507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.488547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.490411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.490468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.490510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.491834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.492114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.493516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.493571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.493617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.495086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.495495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.495557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.216 [2024-07-12 22:40:15.497015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.497065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.497333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.498282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.498333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.499654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.499701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.500153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.500663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.500709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.500751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.501164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.504540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.505582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.505633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.505674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.507579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.507638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.507680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.509012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.509282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.513253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.513313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.513355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.216 [2024-07-12 22:40:15.514737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.515195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.515244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.516554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.516601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.516922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.517833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.517886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.518288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.518333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.518768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.520073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.520120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.520162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.520473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.521373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.523008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.523056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.523100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.525109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.525168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.525213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.526574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.527080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.216 [2024-07-12 22:40:15.529413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.529470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.529513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.531039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.531490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.531547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.532658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.532706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.533043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.533988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.534039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.534830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.534880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.535337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.536301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.536349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.216 [2024-07-12 22:40:15.536391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.536729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.537672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.538687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.538737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.538779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.540651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.542200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.478 [2024-07-12 22:40:15.542254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.543687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.544154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.546405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.547722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.547772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.549088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.549547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.549598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.550894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.550947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.551272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.552164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.552220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.553748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.553793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.554336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.555378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.555425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.556378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.556686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.557599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.558904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.478 [2024-07-12 22:40:15.558958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.478 [2024-07-12 22:40:15.560606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:05.743 [2024-07-12 22:40:15.892167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:05.743 [identical "Failed to get src_mbufs!" records from accel_dpdk_cryptodev_task_alloc_resources (accel_dpdk_cryptodev.c:468), emitted repeatedly between 22:40:15.560606 and 22:40:15.892167, omitted for brevity]
00:33:05.743 [2024-07-12 22:40:15.893082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.893496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.893546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.895128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.895191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.895460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.896452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.896506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.896896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.896947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.897360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.898462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.898512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.898554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.898976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.899911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.900321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.900370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.900412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.901907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.901971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.902013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.902995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.903336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.743 [2024-07-12 22:40:15.905291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.905363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.905406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.905789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.906312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.906362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.907394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.907443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.907711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.908668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.908720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.909890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.909946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.910564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.910970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.911020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.911061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.911335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.912370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.913533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.913586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.913627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.914387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.914782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.743 [2024-07-12 22:40:15.914830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.916335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.916696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.918983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.920070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.920120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.921658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.922146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.922197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.922584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.922628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.922961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.923828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.923879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.925020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.925081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.925531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.926169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.926216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.926602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.926897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.927908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.929454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.929518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.743 [2024-07-12 22:40:15.929911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.930393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.931512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.931560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.932417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.932691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.933619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.934736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.934797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.935184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.743 [2024-07-12 22:40:15.935788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.937424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.937479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.939108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.939378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.940372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.941934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.941982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.943151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.943749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.944153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.944213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.945319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.945675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.744 [2024-07-12 22:40:15.946675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.947087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.947133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.947555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.947974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.949428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.949481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.951046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.951319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.953489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.955063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.955111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.955985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.956942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.958393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.958443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.959884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.960258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.961193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.961712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.962107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.962496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.963000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.963404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.744 [2024-07-12 22:40:15.963795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.964191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.964548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.966091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.966499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.966888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.967285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.968244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.968642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.969038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.969426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.969828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.971251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.971653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.972049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.972458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.973291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.973707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.974108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.974497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.974862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.976322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.976714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.977127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.744 [2024-07-12 22:40:15.977523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.978453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.978851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.979249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.979643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.980053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.981469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.981527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.981912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.981964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.982762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.982842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.983244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.983291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.983706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.985099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.985158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.985541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.985586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.986415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.986487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.986880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.986933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.987336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.744 [2024-07-12 22:40:15.988758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.988816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.989208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.989252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.990095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.990178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.990566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.990610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.991030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.992419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.992478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.992859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.992903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.993771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.993836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.994229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.994274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.994679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.996080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.996139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.996522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.996565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.998463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.998519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.744 [2024-07-12 22:40:15.999069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.999117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:15.999395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.001339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.001397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.002586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.002634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.004521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.004578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.005965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.006012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.006447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.008885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.008956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.010432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.010477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.012114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.012171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.013212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.013273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.013722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.016155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.016229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.016766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.744 [2024-07-12 22:40:16.016817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.018615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.018680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.019075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.019119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.019527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.021853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.021910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.022955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.023003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.024378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.024437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.024821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.024869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.025232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.027202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.027920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.027974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.028358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.030337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.031780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.031833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.032554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.032855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.744 [2024-07-12 22:40:16.033839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.034249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.034300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.035381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.037028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.037085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.038070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.039669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.039973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.041274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.041333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.041729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.043147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.043573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.044182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.045354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.045405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.045675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.046756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.047166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.048704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.048752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.049596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.051147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.744 [2024-07-12 22:40:16.051197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.052162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.052616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.054124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.055487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.055537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.055932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.056882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.744 [2024-07-12 22:40:16.056945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.745 [2024-07-12 22:40:16.057889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.745 [2024-07-12 22:40:16.058779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.745 [2024-07-12 22:40:16.059144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.745 [2024-07-12 22:40:16.060426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.745 [2024-07-12 22:40:16.060485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.745 [2024-07-12 22:40:16.062040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.745 [2024-07-12 22:40:16.063072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.745 [2024-07-12 22:40:16.063533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.745 [2024-07-12 22:40:16.063938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.745 [2024-07-12 22:40:16.064330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.745 [2024-07-12 22:40:16.064384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.745 [2024-07-12 22:40:16.064660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.065628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.067293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.068889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.007 [2024-07-12 22:40:16.068956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.069795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.070988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.071039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.072327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.072650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.075182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.076780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.076837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.078490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.079261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.079319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.079708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.080987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.081325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.083532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.083588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.085252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.086845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.087275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.088734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.089135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.089184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.007 [2024-07-12 22:40:16.089546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.007 [2024-07-12 22:40:16.090426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[... the identical "accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!" message repeats for every allocation attempt between 22:40:16.090426 and 22:40:16.393368; duplicate log entries omitted ...] 
00:33:06.275 [2024-07-12 22:40:16.393368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.275 [2024-07-12 22:40:16.394300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.394833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.394882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.396194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.397808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.397864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.398265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.398657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.399062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.400936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.401001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.402484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.403883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.404417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.406089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.407708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.407764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.408099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.409048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.410254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.411641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.411689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.413064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.414725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.275 [2024-07-12 22:40:16.414783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.416415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.416690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.418029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.418673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.418722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.420013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.421932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.421990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.423043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.424567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.424909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.426492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.426552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.426945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.427607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.428024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.429340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.430887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.430939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.431231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.432227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.433591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.435130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.275 [2024-07-12 22:40:16.435183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.436013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.436937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.436988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.438283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.438607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.441056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.442582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.442637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.444254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.445021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.445079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.445466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.446573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.446894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.448822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.448878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.450424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.452091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.452501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.454077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.454469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.454519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.454908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.275 [2024-07-12 22:40:16.455902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.457202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.458741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.458787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.460551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.461966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.462015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.463544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.463818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.466158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.275 [2024-07-12 22:40:16.467470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.467519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.468828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.469828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.469886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.471197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.472512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.472787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.474121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.474180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.475654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.477040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.477486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.477535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.276 [2024-07-12 22:40:16.479163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.479218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.479535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.480512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.480564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.482092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.482139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.482978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.483440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.483490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.484777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.485054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.486977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.488288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.488337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.489660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.490822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.490880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.491275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.491325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.491671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.494166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.494226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.495465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.276 [2024-07-12 22:40:16.495510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.497440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.497498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.499032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.499079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.499403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.502407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.502472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.504094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.504140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.506089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.506147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.507349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.507396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.507716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.509428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.509487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.509874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.509920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.511625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.511683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.513003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.513051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.513322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.276 [2024-07-12 22:40:16.515496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.515554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.516888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.516940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.517694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.517750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.518148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.518199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.518474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.519468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.519520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.520971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.521018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.521584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.521632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.522920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.522969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.523240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.524542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.524601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.524643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.524700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.526836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.526898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.276 [2024-07-12 22:40:16.526955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.527000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.527268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.528197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.528253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.528296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.528336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.528746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.528794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.528840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.528882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.529158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.530112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.530164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.276 [2024-07-12 22:40:16.530205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.530274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.530900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.530955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.530998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.531039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.531376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.532311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.532362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.532404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.277 [2024-07-12 22:40:16.532445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.532847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.532894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.532942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.532983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.533426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.534355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.535916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.535968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.536968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.537572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.537990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.538044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.539487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.539759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.540688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.541639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.541688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.542994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.543461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.545015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.545064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.545598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.545992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.277 [2024-07-12 22:40:16.547109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.548541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.548590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.550124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.550531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.551911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.551963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.553408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.553744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.554777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.555182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.555234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.556127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.556588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.557894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.557946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.559481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.559841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.560784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.562380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.562441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.563883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.564374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.564776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.277 [2024-07-12 22:40:16.564824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.565871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.566215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.567212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.568422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.568472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.569014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.569552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.569957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.570010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.571424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.571712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.572773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.574225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.574273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.574661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.575207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.576262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.576313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.577507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.577850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.578917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.579537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.579586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.277 [2024-07-12 22:40:16.579981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.580480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.582044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.582102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.583708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.584080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.585036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.585435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.585487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.585875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.586346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.587664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.587713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.588423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.588756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.589771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.589823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.590223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.277 [2024-07-12 22:40:16.590273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.278 [2024-07-12 22:40:16.590723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.278 [2024-07-12 22:40:16.590770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.278 [2024-07-12 22:40:16.591724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.278 [2024-07-12 22:40:16.591772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.278 [2024-07-12 22:40:16.592050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.278 [2024-07-12 22:40:16.594158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.278 [2024-07-12 22:40:16.594216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.278 [2024-07-12 22:40:16.594621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.278 [2024-07-12 22:40:16.594688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.278 [2024-07-12 22:40:16.595334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.538 [2024-07-12 22:40:16.596879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.538 [2024-07-12 22:40:16.596953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.538 [2024-07-12 22:40:16.597001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.597272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.598359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.599903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.599956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.599998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.600902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.600965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.601008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.602071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.602391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.603811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.603888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.603950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.604337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.604749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.539 [2024-07-12 22:40:16.604803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.539 [2024-07-12 22:40:16.606148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:06.539-00:33:06.541 [the identical *ERROR* from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources ("Failed to get src_mbufs!") is logged 190 times in total between 22:40:16.606148 and 22:40:16.717003; the remaining 189 occurrences, differing only in timestamp, are collapsed here]
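The block above is one allocation failure repeated: accel_dpdk_cryptodev_task_alloc_resources() cannot obtain source mbufs while the 128-deep, 64 KiB verify workload keeps tasks outstanding, so the same *ERROR* line is emitted for each task that hits the shortage. A minimal sketch for triaging such an error storm from a saved copy of this output (the file name bdevperf.log is hypothetical, not something this job produces):

# count the repeats and bracket them in time
grep -c 'Failed to get src_mbufs' bdevperf.log
grep 'Failed to get src_mbufs' bdevperf.log | head -n 1   # first occurrence
grep 'Failed to get src_mbufs' bdevperf.log | tail -n 1   # last occurrence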
00:33:06.801 
00:33:06.801 Latency(us) 
00:33:06.801 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 
00:33:06.801 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 
00:33:06.801 Verification LBA range: start 0x0 length 0x100 
00:33:06.801 crypto_ram : 5.77 44.33 2.77 0.00 0.00 2804972.86 60635.05 2567643.49 
00:33:06.801 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 
00:33:06.801 Verification LBA range: start 0x100 length 0x100 
00:33:06.801 crypto_ram : 5.73 44.65 2.79 0.00 0.00 2775130.82 77503.44 2465521.31 
00:33:06.801 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 
00:33:06.801 Verification LBA range: start 0x0 length 0x100 
00:33:06.801 crypto_ram2 : 5.78 44.33 2.77 0.00 0.00 2709356.19 60179.14 2582232.38 
00:33:06.801 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 
00:33:06.801 Verification LBA range: start 0x100 length 0x100 
00:33:06.801 crypto_ram2 : 5.73 44.64 2.79 0.00 0.00 2682944.11 77047.54 2421754.66 
00:33:06.801 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 
00:33:06.801 Verification LBA range: start 0x0 length 0x100 
00:33:06.801 crypto_ram3 : 5.58 286.33 17.90 0.00 0.00 400785.79 2792.40 579908.12 
00:33:06.801 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 
00:33:06.801 Verification LBA range: start 0x100 length 0x100 
00:33:06.801 crypto_ram3 : 5.56 299.54 18.72 0.00 0.00 383612.61 14189.97 579908.12 
00:33:06.801 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 
00:33:06.801 Verification LBA range: start 0x0 length 0x100 
00:33:06.801 crypto_ram4 : 5.65 300.57 18.79 0.00 0.00 371504.49 11226.60 485080.38 
00:33:06.801 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 
00:33:06.801 Verification LBA range: start 0x100 length 0x100 
00:33:06.801 crypto_ram4 : 5.63 315.94 19.75 0.00 0.00 355138.18 27810.06 488727.60 
00:33:06.801 =================================================================================================================== 
00:33:06.801 Total : 1380.33 86.27 0.00 0.00 689087.09 2792.40 2582232.38 
00:33:07.371 
00:33:07.371 real 0m9.036s 
00:33:07.371 user 0m17.114s 
00:33:07.371 sys 0m0.455s 
00:33:07.371 22:40:17 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:33:07.371 22:40:17 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 
00:33:07.371 ************************************ 
00:33:07.371 END TEST bdev_verify_big_io 
00:33:07.371 ************************************ 
00:33:07.371 22:40:17 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 
00:33:07.371 22:40:17 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:33:07.371 22:40:17 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 
00:33:07.371 22:40:17 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 
00:33:07.371 22:40:17 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 
00:33:07.371 ************************************ 
00:33:07.371 START TEST bdev_write_zeroes 
00:33:07.371 ************************************ 
00:33:07.371 22:40:17 blockdev_crypto_aesni.bdev_write_zeroes --
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:07.371 [2024-07-12 22:40:17.682565] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:33:07.371 [2024-07-12 22:40:17.682629] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3610462 ] 00:33:07.630 [2024-07-12 22:40:17.810879] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:07.630 [2024-07-12 22:40:17.907457] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:07.630 [2024-07-12 22:40:17.928754] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:07.630 [2024-07-12 22:40:17.936781] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:07.630 [2024-07-12 22:40:17.944802] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:07.889 [2024-07-12 22:40:18.052634] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:10.453 [2024-07-12 22:40:20.279569] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:10.453 [2024-07-12 22:40:20.279648] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:10.453 [2024-07-12 22:40:20.279663] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:10.453 [2024-07-12 22:40:20.287588] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:10.453 [2024-07-12 22:40:20.287608] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:10.454 [2024-07-12 22:40:20.287620] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:10.454 [2024-07-12 22:40:20.295609] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:10.454 [2024-07-12 22:40:20.295628] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:10.454 [2024-07-12 22:40:20.295639] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:10.454 [2024-07-12 22:40:20.303628] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:10.454 [2024-07-12 22:40:20.303647] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:10.454 [2024-07-12 22:40:20.303659] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:10.454 Running I/O for 1 seconds... 
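For reference, the bdev_write_zeroes step above is just the bdevperf example binary driven by the generated bdev.json; a sketch of re-running the same invocation by hand, with the paths taken verbatim from this log and the flag meanings per standard bdevperf usage:

# -q queue depth, -o I/O size in bytes, -w workload type, -t run time in seconds
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
    --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
    -q 128 -o 4096 -w write_zeroes -t 1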
00:33:11.392 
00:33:11.392 Latency(us) 
00:33:11.392 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 
00:33:11.392 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 
00:33:11.392 crypto_ram : 1.03 1956.63 7.64 0.00 0.00 64841.29 5442.34 77959.35 
00:33:11.392 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 
00:33:11.392 crypto_ram2 : 1.03 1969.86 7.69 0.00 0.00 64125.48 5413.84 72488.51 
00:33:11.392 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 
00:33:11.392 crypto_ram3 : 1.02 15046.71 58.78 0.00 0.00 8371.94 2478.97 10827.69 
00:33:11.392 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 
00:33:11.392 crypto_ram4 : 1.02 15083.50 58.92 0.00 0.00 8325.28 2478.97 8719.14 
00:33:11.392 =================================================================================================================== 
00:33:11.392 Total : 34056.70 133.03 0.00 0.00 14849.07 2478.97 77959.35 
00:33:11.651 
00:33:11.651 real 0m4.193s 
00:33:11.651 user 0m3.790s 
00:33:11.651 sys 0m0.352s 
00:33:11.651 22:40:21 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:33:11.651 22:40:21 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 
00:33:11.651 ************************************ 
00:33:11.651 END TEST bdev_write_zeroes 
00:33:11.651 ************************************ 
00:33:11.651 22:40:21 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 
00:33:11.651 22:40:21 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:33:11.651 22:40:21 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 
00:33:11.651 22:40:21 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 
00:33:11.651 22:40:21 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 
00:33:11.651 ************************************ 
00:33:11.651 START TEST bdev_json_nonenclosed 
00:33:11.651 ************************************ 
00:33:11.651 22:40:21 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:33:11.652 [2024-07-12 22:40:21.953816] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:33:11.652 [2024-07-12 22:40:21.953874] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3611004 ] 
00:33:11.911 [2024-07-12 22:40:22.080342] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 
00:33:11.911 [2024-07-12 22:40:22.177113] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 
00:33:11.911 [2024-07-12 22:40:22.177180] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
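The "not enclosed in {}" error is the expected outcome of this negative test: json_config requires the configuration file to be a single JSON object. An illustrative sketch of the shape that triggers it (the actual contents of nonenclosed.json in the SPDK tree may differ):

cat > nonenclosed.json <<'EOF'
"subsystems": [
  {
    "subsystem": "bdev",
    "config": []
  }
]
EOF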
00:33:11.911 [2024-07-12 22:40:22.177201] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:11.911 [2024-07-12 22:40:22.177213] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:12.171 00:33:12.171 real 0m0.388s 00:33:12.171 user 0m0.240s 00:33:12.171 sys 0m0.144s 00:33:12.171 22:40:22 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:33:12.171 22:40:22 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:12.171 22:40:22 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:12.171 ************************************ 00:33:12.171 END TEST bdev_json_nonenclosed 00:33:12.171 ************************************ 00:33:12.171 22:40:22 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:33:12.171 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true 00:33:12.171 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:12.171 22:40:22 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:12.171 22:40:22 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:12.171 22:40:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:12.171 ************************************ 00:33:12.171 START TEST bdev_json_nonarray 00:33:12.171 ************************************ 00:33:12.171 22:40:22 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:12.171 [2024-07-12 22:40:22.409531] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:33:12.171 [2024-07-12 22:40:22.409591] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3611030 ] 00:33:12.430 [2024-07-12 22:40:22.536949] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:12.430 [2024-07-12 22:40:22.644572] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:12.430 [2024-07-12 22:40:22.644649] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
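The nonarray case is the companion negative test: its configuration supplies "subsystems" as an object rather than an array, which json_config rejects with the error above. Again an illustrative sketch only; the actual nonarray.json may differ:

cat > nonarray.json <<'EOF'
{
  "subsystems": {
    "subsystem": "bdev",
    "config": []
  }
}
EOF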
00:33:12.430 [2024-07-12 22:40:22.644671] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:12.430 [2024-07-12 22:40:22.644683] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:12.430 00:33:12.430 real 0m0.403s 00:33:12.430 user 0m0.244s 00:33:12.430 sys 0m0.156s 00:33:12.430 22:40:22 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:33:12.430 22:40:22 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:12.430 22:40:22 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:12.430 ************************************ 00:33:12.430 END TEST bdev_json_nonarray 00:33:12.430 ************************************ 00:33:12.700 22:40:22 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:33:12.700 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:33:12.700 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:33:12.700 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:33:12.700 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:33:12.700 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:33:12.700 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:33:12.700 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:33:12.700 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:12.700 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:33:12.700 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:33:12.700 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:33:12.700 22:40:22 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:33:12.700 00:33:12.700 real 1m11.721s 00:33:12.700 user 2m39.450s 00:33:12.700 sys 0m8.946s 00:33:12.700 22:40:22 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:12.700 22:40:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:12.700 ************************************ 00:33:12.700 END TEST blockdev_crypto_aesni 00:33:12.700 ************************************ 00:33:12.700 22:40:22 -- common/autotest_common.sh@1142 -- # return 0 00:33:12.700 22:40:22 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:33:12.700 22:40:22 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:12.700 22:40:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:12.700 22:40:22 -- common/autotest_common.sh@10 -- # set +x 00:33:12.700 ************************************ 00:33:12.700 START TEST blockdev_crypto_sw 00:33:12.700 ************************************ 00:33:12.700 22:40:22 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:33:12.700 * Looking for test storage... 
00:33:12.700 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:12.700 22:40:22 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:33:12.700 22:40:22 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:33:12.700 22:40:22 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:33:12.700 22:40:22 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:12.700 22:40:22 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:33:12.700 22:40:22 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:33:12.700 22:40:22 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:33:12.700 22:40:22 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:33:12.700 22:40:22 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=3611256 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:33:12.700 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 3611256 00:33:12.700 22:40:23 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 3611256 ']' 00:33:12.700 22:40:23 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:12.700 22:40:23 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:12.700 22:40:23 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:33:12.700 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:12.700 22:40:23 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:12.700 22:40:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:12.961 [2024-07-12 22:40:23.083698] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:33:12.961 [2024-07-12 22:40:23.083773] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3611256 ] 00:33:12.961 [2024-07-12 22:40:23.204610] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:13.221 [2024-07-12 22:40:23.307309] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:13.789 22:40:23 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:13.789 22:40:23 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:33:13.789 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:33:13.789 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:33:13.789 22:40:23 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:33:13.789 22:40:23 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:13.789 22:40:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:14.102 Malloc0 00:33:14.102 Malloc1 00:33:14.102 true 00:33:14.102 true 00:33:14.102 true 00:33:14.102 [2024-07-12 22:40:24.209678] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:14.102 crypto_ram 00:33:14.102 [2024-07-12 22:40:24.217706] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:14.102 crypto_ram2 00:33:14.102 [2024-07-12 22:40:24.225732] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:14.102 crypto_ram3 00:33:14.102 [ 00:33:14.102 { 00:33:14.102 "name": "Malloc1", 00:33:14.102 "aliases": [ 00:33:14.102 "effd2c2a-2da6-42db-ad96-054a907b9088" 00:33:14.102 ], 00:33:14.102 "product_name": "Malloc disk", 00:33:14.102 "block_size": 4096, 00:33:14.102 "num_blocks": 4096, 00:33:14.102 "uuid": "effd2c2a-2da6-42db-ad96-054a907b9088", 00:33:14.102 "assigned_rate_limits": { 00:33:14.102 "rw_ios_per_sec": 0, 00:33:14.102 "rw_mbytes_per_sec": 0, 00:33:14.102 "r_mbytes_per_sec": 0, 00:33:14.102 "w_mbytes_per_sec": 0 00:33:14.102 }, 00:33:14.102 "claimed": true, 00:33:14.102 "claim_type": "exclusive_write", 00:33:14.102 "zoned": false, 00:33:14.102 "supported_io_types": { 00:33:14.102 "read": true, 00:33:14.102 "write": true, 00:33:14.102 "unmap": true, 00:33:14.102 "flush": true, 00:33:14.102 "reset": true, 00:33:14.102 "nvme_admin": false, 00:33:14.102 "nvme_io": false, 00:33:14.102 "nvme_io_md": false, 00:33:14.102 "write_zeroes": true, 00:33:14.102 "zcopy": true, 00:33:14.102 "get_zone_info": false, 00:33:14.102 "zone_management": false, 00:33:14.102 "zone_append": false, 00:33:14.102 "compare": false, 00:33:14.102 "compare_and_write": false, 00:33:14.102 "abort": true, 00:33:14.102 "seek_hole": false, 00:33:14.102 "seek_data": false, 00:33:14.102 "copy": true, 00:33:14.102 "nvme_iov_md": false 00:33:14.102 }, 00:33:14.102 "memory_domains": [ 00:33:14.102 { 00:33:14.102 "dma_device_id": "system", 00:33:14.102 "dma_device_type": 1 00:33:14.102 }, 00:33:14.102 { 
00:33:14.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:14.102 "dma_device_type": 2 00:33:14.102 } 00:33:14.102 ], 00:33:14.102 "driver_specific": {} 00:33:14.102 } 00:33:14.102 ] 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:14.102 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:14.102 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:33:14.102 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:14.102 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:14.102 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:14.102 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:33:14.102 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:33:14.102 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:14.102 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:14.376 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:14.376 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:33:14.376 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:33:14.376 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "e3205a69-c59f-5b10-9816-1c144e112c4a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e3205a69-c59f-5b10-9816-1c144e112c4a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "44ccc16b-ef44-5e7b-9c2c-2448c7826752"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "44ccc16b-ef44-5e7b-9c2c-2448c7826752",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:14.376 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:33:14.376 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:33:14.376 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:33:14.376 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 3611256 00:33:14.376 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 3611256 ']' 00:33:14.376 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 3611256 00:33:14.376 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:33:14.376 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:14.376 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3611256 00:33:14.376 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:14.376 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:14.376 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3611256' 00:33:14.376 killing process with pid 3611256 00:33:14.376 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 3611256 00:33:14.376 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 3611256 00:33:14.636 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:14.636 22:40:24 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:14.636 
22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:33:14.636 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:14.636 22:40:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:14.636 ************************************ 00:33:14.636 START TEST bdev_hello_world 00:33:14.636 ************************************ 00:33:14.636 22:40:24 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:14.895 [2024-07-12 22:40:24.978005] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:33:14.895 [2024-07-12 22:40:24.978064] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3611464 ] 00:33:14.895 [2024-07-12 22:40:25.107096] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:14.895 [2024-07-12 22:40:25.214210] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:15.155 [2024-07-12 22:40:25.397945] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:15.155 [2024-07-12 22:40:25.398005] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:15.155 [2024-07-12 22:40:25.398019] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:15.155 [2024-07-12 22:40:25.405965] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:15.155 [2024-07-12 22:40:25.405984] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:15.155 [2024-07-12 22:40:25.405996] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:15.155 [2024-07-12 22:40:25.413984] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:15.155 [2024-07-12 22:40:25.414003] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:15.155 [2024-07-12 22:40:25.414023] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:15.155 [2024-07-12 22:40:25.454694] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:33:15.155 [2024-07-12 22:40:25.454731] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:33:15.155 [2024-07-12 22:40:25.454750] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:33:15.155 [2024-07-12 22:40:25.456816] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:33:15.155 [2024-07-12 22:40:25.456888] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:33:15.155 [2024-07-12 22:40:25.456904] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:33:15.155 [2024-07-12 22:40:25.456946] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:33:15.155 00:33:15.155 [2024-07-12 22:40:25.456964] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:15.414 00:33:15.414 real 0m0.747s 00:33:15.414 user 0m0.485s 00:33:15.414 sys 0m0.240s 00:33:15.414 22:40:25 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:15.414 22:40:25 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:15.414 ************************************ 00:33:15.414 END TEST bdev_hello_world 00:33:15.414 ************************************ 00:33:15.414 22:40:25 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:15.414 22:40:25 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:33:15.414 22:40:25 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:15.414 22:40:25 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:15.414 22:40:25 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:15.673 ************************************ 00:33:15.673 START TEST bdev_bounds 00:33:15.673 ************************************ 00:33:15.673 22:40:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:33:15.673 22:40:25 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=3611655 00:33:15.673 22:40:25 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:33:15.673 22:40:25 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:15.673 22:40:25 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 3611655' 00:33:15.673 Process bdevio pid: 3611655 00:33:15.673 22:40:25 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 3611655 00:33:15.673 22:40:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 3611655 ']' 00:33:15.673 22:40:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:15.673 22:40:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:15.673 22:40:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:15.673 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:15.673 22:40:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:15.673 22:40:25 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:15.673 [2024-07-12 22:40:25.847454] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:33:15.673 [2024-07-12 22:40:25.847590] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3611655 ] 00:33:15.933 [2024-07-12 22:40:26.039781] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:15.933 [2024-07-12 22:40:26.138409] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:15.933 [2024-07-12 22:40:26.138494] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:15.933 [2024-07-12 22:40:26.138500] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:16.192 [2024-07-12 22:40:26.307080] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:16.192 [2024-07-12 22:40:26.307160] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:16.192 [2024-07-12 22:40:26.307175] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:16.192 [2024-07-12 22:40:26.315117] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:16.192 [2024-07-12 22:40:26.315135] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:16.192 [2024-07-12 22:40:26.315147] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:16.192 [2024-07-12 22:40:26.323126] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:16.192 [2024-07-12 22:40:26.323144] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:16.192 [2024-07-12 22:40:26.323155] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:16.452 22:40:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:16.452 22:40:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:33:16.452 22:40:26 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:16.712 I/O targets: 00:33:16.712 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:33:16.712 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:33:16.712 00:33:16.712 00:33:16.712 CUnit - A unit testing framework for C - Version 2.1-3 00:33:16.712 http://cunit.sourceforge.net/ 00:33:16.712 00:33:16.712 00:33:16.712 Suite: bdevio tests on: crypto_ram3 00:33:16.712 Test: blockdev write read block ...passed 00:33:16.712 Test: blockdev write zeroes read block ...passed 00:33:16.712 Test: blockdev write zeroes read no split ...passed 00:33:16.712 Test: blockdev write zeroes read split ...passed 00:33:16.712 Test: blockdev write zeroes read split partial ...passed 00:33:16.712 Test: blockdev reset ...passed 00:33:16.712 Test: blockdev write read 8 blocks ...passed 00:33:16.712 Test: blockdev write read size > 128k ...passed 00:33:16.712 Test: blockdev write read invalid size ...passed 00:33:16.712 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:16.712 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:16.712 Test: blockdev write read max offset ...passed 00:33:16.712 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:16.712 Test: blockdev writev readv 8 blocks 
...passed 00:33:16.712 Test: blockdev writev readv 30 x 1block ...passed 00:33:16.712 Test: blockdev writev readv block ...passed 00:33:16.712 Test: blockdev writev readv size > 128k ...passed 00:33:16.712 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:16.712 Test: blockdev comparev and writev ...passed 00:33:16.712 Test: blockdev nvme passthru rw ...passed 00:33:16.712 Test: blockdev nvme passthru vendor specific ...passed 00:33:16.712 Test: blockdev nvme admin passthru ...passed 00:33:16.712 Test: blockdev copy ...passed 00:33:16.712 Suite: bdevio tests on: crypto_ram 00:33:16.712 Test: blockdev write read block ...passed 00:33:16.712 Test: blockdev write zeroes read block ...passed 00:33:16.712 Test: blockdev write zeroes read no split ...passed 00:33:16.712 Test: blockdev write zeroes read split ...passed 00:33:16.712 Test: blockdev write zeroes read split partial ...passed 00:33:16.712 Test: blockdev reset ...passed 00:33:16.712 Test: blockdev write read 8 blocks ...passed 00:33:16.712 Test: blockdev write read size > 128k ...passed 00:33:16.712 Test: blockdev write read invalid size ...passed 00:33:16.712 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:16.712 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:16.712 Test: blockdev write read max offset ...passed 00:33:16.712 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:16.712 Test: blockdev writev readv 8 blocks ...passed 00:33:16.712 Test: blockdev writev readv 30 x 1block ...passed 00:33:16.712 Test: blockdev writev readv block ...passed 00:33:16.712 Test: blockdev writev readv size > 128k ...passed 00:33:16.712 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:16.712 Test: blockdev comparev and writev ...passed 00:33:16.712 Test: blockdev nvme passthru rw ...passed 00:33:16.712 Test: blockdev nvme passthru vendor specific ...passed 00:33:16.712 Test: blockdev nvme admin passthru ...passed 00:33:16.712 Test: blockdev copy ...passed 00:33:16.712 00:33:16.712 Run Summary: Type Total Ran Passed Failed Inactive 00:33:16.712 suites 2 2 n/a 0 0 00:33:16.713 tests 46 46 46 0 0 00:33:16.713 asserts 260 260 260 0 n/a 00:33:16.713 00:33:16.713 Elapsed time = 0.083 seconds 00:33:16.713 0 00:33:16.713 22:40:26 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 3611655 00:33:16.713 22:40:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 3611655 ']' 00:33:16.713 22:40:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 3611655 00:33:16.713 22:40:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:33:16.713 22:40:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:16.713 22:40:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3611655 00:33:16.713 22:40:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:16.713 22:40:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:16.713 22:40:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3611655' 00:33:16.713 killing process with pid 3611655 00:33:16.713 22:40:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 3611655 00:33:16.713 22:40:26 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 
-- # wait 3611655 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:33:16.973 00:33:16.973 real 0m1.443s 00:33:16.973 user 0m3.488s 00:33:16.973 sys 0m0.444s 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:16.973 ************************************ 00:33:16.973 END TEST bdev_bounds 00:33:16.973 ************************************ 00:33:16.973 22:40:27 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:16.973 22:40:27 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:33:16.973 22:40:27 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:33:16.973 22:40:27 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:16.973 22:40:27 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:16.973 ************************************ 00:33:16.973 START TEST bdev_nbd 00:33:16.973 ************************************ 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=3611861 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 3611861 /var/tmp/spdk-nbd.sock 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 3611861 ']' 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:33:16.973 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:16.973 22:40:27 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:17.233 [2024-07-12 22:40:27.345824] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:33:17.233 [2024-07-12 22:40:27.345889] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:17.233 [2024-07-12 22:40:27.472854] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:17.492 [2024-07-12 22:40:27.575907] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:17.492 [2024-07-12 22:40:27.746331] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:17.492 [2024-07-12 22:40:27.746400] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:17.492 [2024-07-12 22:40:27.746416] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:17.492 [2024-07-12 22:40:27.754351] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:17.492 [2024-07-12 22:40:27.754371] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:17.492 [2024-07-12 22:40:27.754383] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:17.492 [2024-07-12 22:40:27.762373] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:17.492 [2024-07-12 22:40:27.762391] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:17.492 [2024-07-12 22:40:27.762403] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:18.062 22:40:28 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:18.062 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:18.322 1+0 records in 00:33:18.322 1+0 records out 00:33:18.322 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255193 s, 16.1 MB/s 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:18.322 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:18.581 1+0 records in 00:33:18.581 1+0 records out 00:33:18.581 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00031824 s, 12.9 MB/s 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:18.581 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:18.582 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:18.841 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:18.841 { 00:33:18.841 "nbd_device": "/dev/nbd0", 00:33:18.841 "bdev_name": "crypto_ram" 00:33:18.841 }, 00:33:18.841 { 00:33:18.841 "nbd_device": "/dev/nbd1", 00:33:18.841 "bdev_name": "crypto_ram3" 00:33:18.841 } 00:33:18.841 ]' 00:33:18.841 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:18.841 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:18.841 { 00:33:18.841 "nbd_device": "/dev/nbd0", 00:33:18.841 "bdev_name": "crypto_ram" 00:33:18.841 }, 00:33:18.841 { 00:33:18.841 "nbd_device": "/dev/nbd1", 00:33:18.841 "bdev_name": "crypto_ram3" 00:33:18.841 } 00:33:18.841 ]' 00:33:18.841 22:40:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:18.841 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 
/dev/nbd1' 00:33:18.841 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:18.841 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:18.841 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:18.841 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:18.841 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:18.841 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:19.100 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:19.100 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:19.100 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:19.100 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:19.100 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:19.101 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:19.101 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:19.101 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:19.101 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:19.101 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:19.360 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:19.360 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:19.360 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:19.360 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:19.360 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:19.360 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:19.360 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:19.360 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:19.360 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:19.360 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:19.360 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- 
# grep -c /dev/nbd 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:19.619 22:40:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:19.878 /dev/nbd0 00:33:19.878 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:19.878 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:19.878 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:19.878 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:19.878 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:19.878 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:19.878 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:19.879 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:19.879 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:19.879 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:19.879 22:40:30 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:19.879 1+0 records in 00:33:19.879 1+0 records out 00:33:19.879 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276027 s, 14.8 MB/s 00:33:19.879 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:19.879 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:19.879 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:19.879 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:19.879 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:19.879 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:19.879 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:19.879 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:33:20.138 /dev/nbd1 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:20.138 1+0 records in 00:33:20.138 1+0 records out 00:33:20.138 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000334041 s, 12.3 MB/s 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:20.138 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:20.398 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:20.398 { 00:33:20.398 "nbd_device": "/dev/nbd0", 00:33:20.398 "bdev_name": "crypto_ram" 00:33:20.398 }, 00:33:20.398 { 00:33:20.398 "nbd_device": "/dev/nbd1", 00:33:20.398 "bdev_name": "crypto_ram3" 00:33:20.398 } 00:33:20.398 ]' 00:33:20.398 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:20.398 { 00:33:20.398 "nbd_device": "/dev/nbd0", 00:33:20.398 "bdev_name": "crypto_ram" 00:33:20.398 }, 00:33:20.398 { 00:33:20.398 "nbd_device": "/dev/nbd1", 00:33:20.398 "bdev_name": "crypto_ram3" 00:33:20.398 } 00:33:20.398 ]' 00:33:20.398 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:20.398 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:20.398 /dev/nbd1' 00:33:20.398 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:20.398 /dev/nbd1' 00:33:20.398 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:20.398 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:33:20.398 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:33:20.398 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:33:20.398 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:33:20.398 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:20.658 256+0 records in 00:33:20.658 256+0 records out 00:33:20.658 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114904 s, 91.3 MB/s 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:20.658 256+0 records in 00:33:20.658 256+0 records out 00:33:20.658 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0286623 s, 36.6 MB/s 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:20.658 256+0 records in 00:33:20.658 256+0 records out 00:33:20.658 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0450989 s, 23.3 MB/s 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:20.658 22:40:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:20.918 22:40:31 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:20.918 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:21.177 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:21.177 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:21.177 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:21.177 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:21.177 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:21.177 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:21.177 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:21.177 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:21.177 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:21.177 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:21.177 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:21.436 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:21.436 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:21.436 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:21.693 22:40:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:21.951 malloc_lvol_verify 00:33:21.951 22:40:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:21.951 14193431-bd15-44ec-ac97-81e852b3088d 00:33:22.210 22:40:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:33:22.470 191284a2-91be-45e8-80d0-0ce03fedf40b 00:33:22.729 22:40:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:22.729 /dev/nbd0 00:33:22.729 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:33:22.729 mke2fs 1.46.5 (30-Dec-2021) 00:33:22.988 Discarding device blocks: 0/4096 done 00:33:22.988 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:22.988 00:33:22.988 Allocating group tables: 0/1 done 00:33:22.988 Writing inode tables: 0/1 done 00:33:22.988 Creating journal (1024 blocks): done 00:33:22.988 Writing superblocks and filesystem accounting information: 0/1 done 00:33:22.988 00:33:22.988 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:22.988 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:33:22.988 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:22.988 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:22.988 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:22.988 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:22.988 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:22.988 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:23.247 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 3611861 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 3611861 ']' 00:33:23.248 22:40:33 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 3611861 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3611861 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3611861' 00:33:23.248 killing process with pid 3611861 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 3611861 00:33:23.248 22:40:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 3611861 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:33:23.506 00:33:23.506 real 0m6.362s 00:33:23.506 user 0m9.166s 00:33:23.506 sys 0m2.451s 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:23.506 ************************************ 00:33:23.506 END TEST bdev_nbd 00:33:23.506 ************************************ 00:33:23.506 22:40:33 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:23.506 22:40:33 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:33:23.506 22:40:33 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:33:23.506 22:40:33 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:33:23.506 22:40:33 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:33:23.506 22:40:33 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:23.506 22:40:33 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:23.506 22:40:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:23.506 ************************************ 00:33:23.506 START TEST bdev_fio 00:33:23.506 ************************************ 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:23.506 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 
-- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:23.506 
************************************ 00:33:23.506 START TEST bdev_fio_rw_verify 00:33:23.506 ************************************ 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:23.506 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:23.765 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:23.765 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:23.765 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:23.765 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:23.765 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:23.765 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:23.765 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:23.765 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:23.765 22:40:33 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:23.765 22:40:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:24.024 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:24.024 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:24.024 fio-3.35 00:33:24.024 Starting 2 threads 00:33:36.234 00:33:36.234 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=3612971: Fri Jul 12 22:40:44 2024 00:33:36.234 read: IOPS=27.1k, BW=106MiB/s (111MB/s)(1058MiB/10001msec) 00:33:36.234 slat (usec): min=8, max=439, avg=16.40, stdev= 5.48 00:33:36.234 clat (usec): min=4, max=524, avg=119.23, stdev=63.26 00:33:36.234 lat (usec): min=14, max=601, avg=135.62, stdev=67.00 00:33:36.234 clat percentiles (usec): 00:33:36.234 | 50.000th=[ 109], 99.000th=[ 273], 99.900th=[ 289], 99.990th=[ 338], 00:33:36.234 | 99.999th=[ 502] 00:33:36.234 write: IOPS=32.5k, BW=127MiB/s (133MB/s)(1203MiB/9481msec); 0 zone resets 00:33:36.234 slat (usec): min=9, max=1673, avg=27.27, stdev= 8.74 00:33:36.234 clat (usec): min=9, max=2071, avg=157.79, stdev=93.77 00:33:36.234 lat (usec): min=33, max=2110, avg=185.06, stdev=99.55 00:33:36.234 clat percentiles (usec): 00:33:36.234 | 50.000th=[ 143], 99.000th=[ 396], 99.900th=[ 416], 99.990th=[ 429], 00:33:36.234 | 99.999th=[ 494] 00:33:36.234 bw ( KiB/s): min=119392, max=127064, per=94.96%, avg=123391.16, stdev=1258.59, samples=38 00:33:36.234 iops : min=29848, max=31766, avg=30847.79, stdev=314.65, samples=38 00:33:36.234 lat (usec) : 10=0.01%, 20=0.01%, 50=10.63%, 100=26.16%, 250=51.71% 00:33:36.234 lat (usec) : 500=11.49%, 750=0.01% 00:33:36.234 lat (msec) : 4=0.01% 00:33:36.234 cpu : usr=99.59%, sys=0.01%, ctx=23, majf=0, minf=477 00:33:36.234 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:36.234 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:36.234 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:36.234 issued rwts: total=270791,307979,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:36.234 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:36.235 00:33:36.235 Run status group 0 (all jobs): 00:33:36.235 READ: bw=106MiB/s (111MB/s), 106MiB/s-106MiB/s (111MB/s-111MB/s), io=1058MiB (1109MB), run=10001-10001msec 00:33:36.235 WRITE: bw=127MiB/s (133MB/s), 127MiB/s-127MiB/s (133MB/s-133MB/s), io=1203MiB (1261MB), run=9481-9481msec 00:33:36.235 00:33:36.235 real 0m11.068s 00:33:36.235 user 0m23.546s 00:33:36.235 sys 0m0.316s 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:36.235 ************************************ 00:33:36.235 END TEST bdev_fio_rw_verify 00:33:36.235 ************************************ 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1142 -- # return 0 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "e3205a69-c59f-5b10-9816-1c144e112c4a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e3205a69-c59f-5b10-9816-1c144e112c4a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "44ccc16b-ef44-5e7b-9c2c-2448c7826752"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' 
"uuid": "44ccc16b-ef44-5e7b-9c2c-2448c7826752",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:33:36.235 crypto_ram3 ]] 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "e3205a69-c59f-5b10-9816-1c144e112c4a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e3205a69-c59f-5b10-9816-1c144e112c4a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "44ccc16b-ef44-5e7b-9c2c-2448c7826752"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "44ccc16b-ef44-5e7b-9c2c-2448c7826752",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' 
' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:36.235 22:40:44 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:36.235 ************************************ 00:33:36.235 START TEST bdev_fio_trim 00:33:36.235 ************************************ 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # 
sanitizers=('libasan' 'libclang_rt.asan') 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:36.235 22:40:45 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:36.236 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:36.236 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:36.236 fio-3.35 00:33:36.236 Starting 2 threads 00:33:46.217 00:33:46.217 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=3614486: Fri Jul 12 22:40:56 2024 00:33:46.217 write: IOPS=25.5k, BW=99.7MiB/s (105MB/s)(997MiB/10001msec); 0 zone resets 00:33:46.217 slat (usec): min=18, max=730, avg=34.41, stdev= 9.15 00:33:46.217 clat (usec): min=48, max=2278, avg=257.78, stdev=65.44 00:33:46.217 lat (usec): min=94, max=2301, avg=292.20, stdev=62.08 00:33:46.217 clat percentiles (usec): 00:33:46.217 | 50.000th=[ 269], 
99.000th=[ 363], 99.900th=[ 388], 99.990th=[ 603], 00:33:46.217 | 99.999th=[ 1057] 00:33:46.217 bw ( KiB/s): min=101080, max=102464, per=100.00%, avg=102166.32, stdev=173.72, samples=38 00:33:46.217 iops : min=25270, max=25616, avg=25541.58, stdev=43.43, samples=38 00:33:46.217 trim: IOPS=25.5k, BW=99.7MiB/s (105MB/s)(997MiB/10001msec); 0 zone resets 00:33:46.217 slat (usec): min=7, max=318, avg=15.30, stdev= 4.81 00:33:46.217 clat (usec): min=45, max=1100, avg=172.06, stdev=90.83 00:33:46.217 lat (usec): min=53, max=1117, avg=187.35, stdev=93.62 00:33:46.217 clat percentiles (usec): 00:33:46.217 | 50.000th=[ 145], 99.000th=[ 375], 99.900th=[ 392], 99.990th=[ 408], 00:33:46.217 | 99.999th=[ 971] 00:33:46.217 bw ( KiB/s): min=101112, max=102464, per=100.00%, avg=102167.58, stdev=169.49, samples=38 00:33:46.217 iops : min=25278, max=25616, avg=25541.89, stdev=42.37, samples=38 00:33:46.217 lat (usec) : 50=0.01%, 100=12.16%, 250=45.73%, 500=42.09%, 750=0.01% 00:33:46.217 lat (usec) : 1000=0.01% 00:33:46.217 lat (msec) : 2=0.01%, 4=0.01% 00:33:46.217 cpu : usr=99.42%, sys=0.01%, ctx=34, majf=0, minf=257 00:33:46.217 IO depths : 1=5.2%, 2=14.0%, 4=64.6%, 8=16.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:46.217 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:46.217 complete : 0=0.0%, 4=86.1%, 8=13.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:46.217 issued rwts: total=0,255351,255352,0 short=0,0,0,0 dropped=0,0,0,0 00:33:46.217 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:46.217 00:33:46.217 Run status group 0 (all jobs): 00:33:46.217 WRITE: bw=99.7MiB/s (105MB/s), 99.7MiB/s-99.7MiB/s (105MB/s-105MB/s), io=997MiB (1046MB), run=10001-10001msec 00:33:46.217 TRIM: bw=99.7MiB/s (105MB/s), 99.7MiB/s-99.7MiB/s (105MB/s-105MB/s), io=997MiB (1046MB), run=10001-10001msec 00:33:46.217 00:33:46.217 real 0m11.148s 00:33:46.217 user 0m23.829s 00:33:46.217 sys 0m0.374s 00:33:46.217 22:40:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:46.217 22:40:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:46.217 ************************************ 00:33:46.217 END TEST bdev_fio_trim 00:33:46.217 ************************************ 00:33:46.217 22:40:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:46.217 22:40:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:33:46.217 22:40:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:46.217 22:40:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:33:46.217 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:46.217 22:40:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:33:46.217 00:33:46.217 real 0m22.526s 00:33:46.217 user 0m47.538s 00:33:46.217 sys 0m0.851s 00:33:46.217 22:40:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:46.217 22:40:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:46.217 ************************************ 00:33:46.217 END TEST bdev_fio 00:33:46.217 ************************************ 00:33:46.217 22:40:56 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:46.217 22:40:56 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:46.217 22:40:56 blockdev_crypto_sw -- bdev/blockdev.sh@777 
-- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:46.217 22:40:56 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:46.217 22:40:56 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:46.217 22:40:56 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:46.217 ************************************ 00:33:46.217 START TEST bdev_verify 00:33:46.217 ************************************ 00:33:46.217 22:40:56 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:46.217 [2024-07-12 22:40:56.355397] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:33:46.217 [2024-07-12 22:40:56.355441] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3615900 ] 00:33:46.217 [2024-07-12 22:40:56.463836] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:46.477 [2024-07-12 22:40:56.561438] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:46.477 [2024-07-12 22:40:56.561444] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:46.477 [2024-07-12 22:40:56.727803] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:46.477 [2024-07-12 22:40:56.727876] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:46.477 [2024-07-12 22:40:56.727891] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:46.477 [2024-07-12 22:40:56.735826] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:46.477 [2024-07-12 22:40:56.735845] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:46.477 [2024-07-12 22:40:56.735857] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:46.477 [2024-07-12 22:40:56.743851] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:46.477 [2024-07-12 22:40:56.743868] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:46.477 [2024-07-12 22:40:56.743880] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:46.477 Running I/O for 5 seconds... 
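The bdevperf verify run above (and the fio runs before it) take their device layout from bdev.json, which the test script saves from the live SPDK target. Based on the bdev dumps earlier in this section (crypto_ram over Malloc0 with key test_dek_sw, and crypto_ram3 stacked on crypto_ram2 with key test_dek_sw3), a rough hand-built equivalent over scripts/rpc.py would look like the sketch below; the cipher and key material are invented, the malloc sizes are approximated from the dumps, and the exact accel_crypto_key_create/bdev_crypto_create option spellings may differ between SPDK versions:

# register the software-crypto DEKs (one shown; test_dek_sw2/test_dek_sw3 are created the same way)
./scripts/rpc.py accel_crypto_key_create -c AES_XTS -n test_dek_sw \
    -k 00112233445566778899aabbccddeeff -e ffeeddccbbaa99887766554433221100
# base malloc bdevs (~16 MiB each, block sizes taken from the dumps above)
./scripts/rpc.py bdev_malloc_create -b Malloc0 16 512
./scripts/rpc.py bdev_malloc_create -b Malloc1 16 4096
# crypto vbdevs: crypto_ram over Malloc0; crypto_ram3 stacked on crypto_ram2 over Malloc1
./scripts/rpc.py bdev_crypto_create Malloc0 crypto_ram -n test_dek_sw
./scripts/rpc.py bdev_crypto_create Malloc1 crypto_ram2 -n test_dek_sw2
./scripts/rpc.py bdev_crypto_create crypto_ram2 crypto_ram3 -n test_dek_sw3
# the test script then captures the resulting configuration into bdev.json for fio/bdevperf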
00:33:51.809 00:33:51.809 Latency(us) 00:33:51.809 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:51.809 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:51.809 Verification LBA range: start 0x0 length 0x800 00:33:51.809 crypto_ram : 5.02 5813.84 22.71 0.00 0.00 21924.95 1880.60 27240.18 00:33:51.809 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:51.809 Verification LBA range: start 0x800 length 0x800 00:33:51.809 crypto_ram : 5.02 5816.37 22.72 0.00 0.00 21919.21 1802.24 27126.21 00:33:51.809 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:51.809 Verification LBA range: start 0x0 length 0x800 00:33:51.809 crypto_ram3 : 5.02 2905.13 11.35 0.00 0.00 43807.54 8605.16 31001.38 00:33:51.809 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:51.809 Verification LBA range: start 0x800 length 0x800 00:33:51.809 crypto_ram3 : 5.02 2906.63 11.35 0.00 0.00 43780.41 8605.16 31001.38 00:33:51.809 =================================================================================================================== 00:33:51.809 Total : 17441.96 68.13 0.00 0.00 29212.71 1802.24 31001.38 00:33:51.809 00:33:51.809 real 0m5.754s 00:33:51.809 user 0m10.887s 00:33:51.809 sys 0m0.206s 00:33:51.809 22:41:02 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:51.809 22:41:02 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:33:51.809 ************************************ 00:33:51.809 END TEST bdev_verify 00:33:51.809 ************************************ 00:33:51.809 22:41:02 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:51.809 22:41:02 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:51.809 22:41:02 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:51.809 22:41:02 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:51.809 22:41:02 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:52.069 ************************************ 00:33:52.069 START TEST bdev_verify_big_io 00:33:52.069 ************************************ 00:33:52.069 22:41:02 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:52.069 [2024-07-12 22:41:02.184993] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:33:52.069 [2024-07-12 22:41:02.185036] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3616612 ] 00:33:52.069 [2024-07-12 22:41:02.294243] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:52.328 [2024-07-12 22:41:02.399015] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:52.328 [2024-07-12 22:41:02.399021] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:52.328 [2024-07-12 22:41:02.565545] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:52.328 [2024-07-12 22:41:02.565606] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:52.328 [2024-07-12 22:41:02.565622] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:52.328 [2024-07-12 22:41:02.573565] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:52.328 [2024-07-12 22:41:02.573584] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:52.328 [2024-07-12 22:41:02.573596] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:52.328 [2024-07-12 22:41:02.581586] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:52.328 [2024-07-12 22:41:02.581603] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:52.328 [2024-07-12 22:41:02.581615] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:52.328 Running I/O for 5 seconds... 
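Every test in this section is wrapped by the run_test helper from autotest_common.sh, which produces the START TEST/END TEST banners and the real/user/sys timings seen throughout this log. A simplified sketch of what that wrapper does (not the actual implementation, which also manages xtrace and failure accounting):

run_test() {
    local test_name=$1; shift
    echo "************************************"
    echo "START TEST $test_name"
    echo "************************************"
    time "$@"                       # run the wrapped command; bash's time keyword prints real/user/sys
    local rc=$?
    echo "************************************"
    echo "END TEST $test_name"
    echo "************************************"
    return $rc
}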
00:33:58.896 00:33:58.896 Latency(us) 00:33:58.896 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:58.896 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:58.896 Verification LBA range: start 0x0 length 0x80 00:33:58.896 crypto_ram : 5.13 424.56 26.54 0.00 0.00 294169.14 8377.21 408488.74 00:33:58.896 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:58.896 Verification LBA range: start 0x80 length 0x80 00:33:58.896 crypto_ram : 5.12 449.57 28.10 0.00 0.00 277907.78 8206.25 403017.91 00:33:58.896 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:58.896 Verification LBA range: start 0x0 length 0x80 00:33:58.896 crypto_ram3 : 5.33 240.32 15.02 0.00 0.00 499676.67 5784.26 408488.74 00:33:58.896 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:58.896 Verification LBA range: start 0x80 length 0x80 00:33:58.896 crypto_ram3 : 5.31 240.83 15.05 0.00 0.00 498664.63 6325.65 403017.91 00:33:58.896 =================================================================================================================== 00:33:58.896 Total : 1355.30 84.71 0.00 0.00 363393.24 5784.26 408488.74 00:33:58.896 00:33:58.896 real 0m6.047s 00:33:58.896 user 0m11.471s 00:33:58.896 sys 0m0.212s 00:33:58.896 22:41:08 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:58.896 22:41:08 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:33:58.896 ************************************ 00:33:58.896 END TEST bdev_verify_big_io 00:33:58.896 ************************************ 00:33:58.896 22:41:08 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:58.896 22:41:08 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:58.896 22:41:08 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:58.896 22:41:08 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:58.896 22:41:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:58.896 ************************************ 00:33:58.896 START TEST bdev_write_zeroes 00:33:58.896 ************************************ 00:33:58.896 22:41:08 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:58.896 [2024-07-12 22:41:08.337232] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:33:58.896 [2024-07-12 22:41:08.337293] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3617405 ] 00:33:58.896 [2024-07-12 22:41:08.464203] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:58.896 [2024-07-12 22:41:08.564503] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:58.896 [2024-07-12 22:41:08.741143] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:58.896 [2024-07-12 22:41:08.741222] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:58.896 [2024-07-12 22:41:08.741238] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:58.896 [2024-07-12 22:41:08.749163] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:58.896 [2024-07-12 22:41:08.749183] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:58.896 [2024-07-12 22:41:08.749195] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:58.896 [2024-07-12 22:41:08.757183] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:58.896 [2024-07-12 22:41:08.757201] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:58.896 [2024-07-12 22:41:08.757214] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:58.896 Running I/O for 1 seconds... 00:33:59.832 00:33:59.832 Latency(us) 00:33:59.832 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:59.832 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:59.832 crypto_ram : 1.01 26451.88 103.33 0.00 0.00 4827.60 1296.47 6610.59 00:33:59.832 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:59.832 crypto_ram3 : 1.01 13199.04 51.56 0.00 0.00 9625.51 6012.22 9915.88 00:33:59.832 =================================================================================================================== 00:33:59.832 Total : 39650.92 154.89 0.00 0.00 6426.90 1296.47 9915.88 00:33:59.832 00:33:59.832 real 0m1.752s 00:33:59.832 user 0m1.499s 00:33:59.832 sys 0m0.236s 00:33:59.832 22:41:10 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:59.832 22:41:10 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:33:59.832 ************************************ 00:33:59.832 END TEST bdev_write_zeroes 00:33:59.832 ************************************ 00:33:59.832 22:41:10 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:59.832 22:41:10 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:59.832 22:41:10 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:59.832 22:41:10 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:59.832 22:41:10 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:59.832 
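The next two tests, bdev_json_nonenclosed and bdev_json_nonarray, are negative tests: bdevperf is pointed at deliberately malformed JSON configs and is expected to exit non-zero (the es=234 captured below). Judging by the "not enclosed in {}" and "'subsystems' should be an array" errors later in this section, the inputs are roughly of the following shape; these are illustrative sketches, not the actual files under test/bdev:

# nonenclosed.json: top-level object not wrapped in { }
cat > nonenclosed.json <<'EOF'
"subsystems": [
  { "subsystem": "bdev", "config": [] }
]
EOF

# nonarray.json: "subsystems" present but not an array
cat > nonarray.json <<'EOF'
{ "subsystems": { "subsystem": "bdev", "config": [] } }
EOF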
************************************ 00:33:59.832 START TEST bdev_json_nonenclosed 00:33:59.832 ************************************ 00:33:59.832 22:41:10 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:00.091 [2024-07-12 22:41:10.173078] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:34:00.091 [2024-07-12 22:41:10.173140] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3617696 ] 00:34:00.091 [2024-07-12 22:41:10.298233] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:00.091 [2024-07-12 22:41:10.394614] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:00.091 [2024-07-12 22:41:10.394686] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:34:00.091 [2024-07-12 22:41:10.394707] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:00.091 [2024-07-12 22:41:10.394720] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:00.350 00:34:00.350 real 0m0.385s 00:34:00.350 user 0m0.240s 00:34:00.350 sys 0m0.142s 00:34:00.350 22:41:10 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:34:00.350 22:41:10 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:00.350 22:41:10 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:34:00.350 ************************************ 00:34:00.350 END TEST bdev_json_nonenclosed 00:34:00.350 ************************************ 00:34:00.350 22:41:10 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:34:00.350 22:41:10 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:34:00.350 22:41:10 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:00.350 22:41:10 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:00.350 22:41:10 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:00.350 22:41:10 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:00.350 ************************************ 00:34:00.350 START TEST bdev_json_nonarray 00:34:00.350 ************************************ 00:34:00.350 22:41:10 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:00.350 [2024-07-12 22:41:10.640160] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:34:00.350 [2024-07-12 22:41:10.640220] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3617723 ] 00:34:00.609 [2024-07-12 22:41:10.768151] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:00.609 [2024-07-12 22:41:10.870375] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:00.609 [2024-07-12 22:41:10.870450] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:34:00.609 [2024-07-12 22:41:10.870471] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:00.609 [2024-07-12 22:41:10.870483] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:00.868 00:34:00.868 real 0m0.398s 00:34:00.868 user 0m0.249s 00:34:00.868 sys 0m0.146s 00:34:00.868 22:41:10 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:34:00.868 22:41:10 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:00.868 22:41:10 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:34:00.868 ************************************ 00:34:00.869 END TEST bdev_json_nonarray 00:34:00.869 ************************************ 00:34:00.869 22:41:11 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:34:00.869 22:41:11 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:34:00.869 22:41:11 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:34:00.869 22:41:11 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:34:00.869 22:41:11 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:34:00.869 22:41:11 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:34:00.869 22:41:11 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:34:00.869 22:41:11 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:00.869 22:41:11 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:00.869 ************************************ 00:34:00.869 START TEST bdev_crypto_enomem 00:34:00.869 ************************************ 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=3617845 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # 
waitforlisten 3617845 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 3617845 ']' 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:00.869 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:00.869 22:41:11 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:00.869 [2024-07-12 22:41:11.132181] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:34:00.869 [2024-07-12 22:41:11.132251] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3617845 ] 00:34:01.129 [2024-07-12 22:41:11.251429] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:01.129 [2024-07-12 22:41:11.347281] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:02.067 true 00:34:02.067 base0 00:34:02.067 true 00:34:02.067 [2024-07-12 22:41:12.088216] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:02.067 crypt0 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:02.067 22:41:12 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:02.067 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:02.067 [ 00:34:02.067 { 00:34:02.067 "name": "crypt0", 00:34:02.067 "aliases": [ 00:34:02.067 "fb61add3-7a77-5e54-ba4e-291920f26fc4" 00:34:02.067 ], 00:34:02.067 "product_name": "crypto", 00:34:02.067 "block_size": 512, 00:34:02.067 "num_blocks": 2097152, 00:34:02.067 "uuid": "fb61add3-7a77-5e54-ba4e-291920f26fc4", 00:34:02.067 "assigned_rate_limits": { 00:34:02.067 "rw_ios_per_sec": 0, 00:34:02.067 "rw_mbytes_per_sec": 0, 00:34:02.067 "r_mbytes_per_sec": 0, 00:34:02.067 "w_mbytes_per_sec": 0 00:34:02.067 }, 00:34:02.067 "claimed": false, 00:34:02.067 "zoned": false, 00:34:02.067 "supported_io_types": { 00:34:02.067 "read": true, 00:34:02.067 "write": true, 00:34:02.067 "unmap": false, 00:34:02.067 "flush": false, 00:34:02.067 "reset": true, 00:34:02.067 "nvme_admin": false, 00:34:02.067 "nvme_io": false, 00:34:02.067 "nvme_io_md": false, 00:34:02.067 "write_zeroes": true, 00:34:02.067 "zcopy": false, 00:34:02.067 "get_zone_info": false, 00:34:02.067 "zone_management": false, 00:34:02.067 "zone_append": false, 00:34:02.067 "compare": false, 00:34:02.067 "compare_and_write": false, 00:34:02.067 "abort": false, 00:34:02.067 "seek_hole": false, 00:34:02.067 "seek_data": false, 00:34:02.067 "copy": false, 00:34:02.067 "nvme_iov_md": false 00:34:02.067 }, 00:34:02.067 "memory_domains": [ 00:34:02.067 { 00:34:02.068 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:02.068 "dma_device_type": 2 00:34:02.068 } 00:34:02.068 ], 00:34:02.068 "driver_specific": { 00:34:02.068 "crypto": { 00:34:02.068 "base_bdev_name": "EE_base0", 00:34:02.068 "name": "crypt0", 00:34:02.068 "key_name": "test_dek_sw" 00:34:02.068 } 00:34:02.068 } 00:34:02.068 } 00:34:02.068 ] 00:34:02.068 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:02.068 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:34:02.068 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=3617923 00:34:02.068 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:34:02.068 22:41:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:02.068 Running I/O for 5 seconds... 
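The crypt0 bdev in this test sits on EE_base0, an error-injection bdev wrapped around base0 (see the driver_specific block above); that is how the crypto vbdev's ENOMEM handling is exercised: once I/O is running, write completions are forced to fail with 'nomem'. A rough hand-driven equivalent of the setup is sketched below; bdev names and the inject_error arguments follow the rpc_cmd call just below, the malloc sizing is derived from crypt0's 2097152 x 512B blocks, and the bdev_crypto_create key-name option spelling is an assumption that may vary by SPDK version:

./scripts/rpc.py bdev_malloc_create -b base0 1024 512        # ~1 GiB backing device
./scripts/rpc.py bdev_error_create base0                     # exposes the error-injection wrapper EE_base0
./scripts/rpc.py bdev_crypto_create EE_base0 crypt0 -n test_dek_sw
# while bdevperf runs, fail five queued write completions with ENOMEM
./scripts/rpc.py bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem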
00:34:03.005 22:41:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:34:03.005 22:41:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.005 22:41:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:03.005 22:41:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.005 22:41:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 3617923 00:34:07.196 00:34:07.196 Latency(us) 00:34:07.196 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:07.196 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:34:07.196 crypt0 : 5.00 35959.71 140.47 0.00 0.00 886.07 418.50 1182.50 00:34:07.196 =================================================================================================================== 00:34:07.196 Total : 35959.71 140.47 0.00 0.00 886.07 418.50 1182.50 00:34:07.196 0 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 3617845 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 3617845 ']' 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 3617845 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3617845 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3617845' 00:34:07.196 killing process with pid 3617845 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 3617845 00:34:07.196 Received shutdown signal, test time was about 5.000000 seconds 00:34:07.196 00:34:07.196 Latency(us) 00:34:07.196 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:07.196 =================================================================================================================== 00:34:07.196 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 3617845 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:34:07.196 00:34:07.196 real 0m6.448s 00:34:07.196 user 0m6.731s 00:34:07.196 sys 0m0.358s 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:34:07.196 22:41:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:07.196 ************************************ 00:34:07.196 END TEST bdev_crypto_enomem 00:34:07.196 ************************************ 00:34:07.456 22:41:17 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:07.456 22:41:17 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:34:07.456 22:41:17 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:34:07.456 22:41:17 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:34:07.456 22:41:17 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:07.456 22:41:17 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:34:07.456 22:41:17 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:34:07.456 22:41:17 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:34:07.456 22:41:17 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:34:07.456 00:34:07.456 real 0m54.683s 00:34:07.456 user 1m34.119s 00:34:07.456 sys 0m6.456s 00:34:07.456 22:41:17 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:07.456 22:41:17 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:07.456 ************************************ 00:34:07.456 END TEST blockdev_crypto_sw 00:34:07.456 ************************************ 00:34:07.456 22:41:17 -- common/autotest_common.sh@1142 -- # return 0 00:34:07.456 22:41:17 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:34:07.456 22:41:17 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:07.456 22:41:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:07.456 22:41:17 -- common/autotest_common.sh@10 -- # set +x 00:34:07.456 ************************************ 00:34:07.456 START TEST blockdev_crypto_qat 00:34:07.456 ************************************ 00:34:07.456 22:41:17 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:34:07.456 * Looking for test storage... 
00:34:07.456 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:34:07.456 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=3618725 00:34:07.457 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:34:07.457 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 3618725 00:34:07.457 22:41:17 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:34:07.457 22:41:17 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 3618725 ']' 00:34:07.457 22:41:17 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:07.457 22:41:17 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:07.457 22:41:17 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:34:07.457 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:07.457 22:41:17 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:07.457 22:41:17 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:07.716 [2024-07-12 22:41:17.834993] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:34:07.716 [2024-07-12 22:41:17.835068] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3618725 ] 00:34:07.716 [2024-07-12 22:41:17.958306] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:07.976 [2024-07-12 22:41:18.063731] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:08.545 22:41:18 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:08.545 22:41:18 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:34:08.545 22:41:18 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:34:08.545 22:41:18 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:34:08.545 22:41:18 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:34:08.545 22:41:18 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:08.545 22:41:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:08.545 [2024-07-12 22:41:18.737893] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:08.545 [2024-07-12 22:41:18.745940] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:08.545 [2024-07-12 22:41:18.753952] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:08.545 [2024-07-12 22:41:18.819335] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:11.083 true 00:34:11.083 true 00:34:11.083 true 00:34:11.083 true 00:34:11.083 Malloc0 00:34:11.083 Malloc1 00:34:11.083 Malloc2 00:34:11.083 Malloc3 00:34:11.083 [2024-07-12 22:41:21.200178] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:11.083 crypto_ram 00:34:11.083 [2024-07-12 22:41:21.208198] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:11.083 crypto_ram1 00:34:11.083 [2024-07-12 22:41:21.216220] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:11.083 crypto_ram2 00:34:11.083 [2024-07-12 22:41:21.224256] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:11.083 crypto_ram3 00:34:11.083 [ 00:34:11.083 { 00:34:11.083 "name": "Malloc1", 00:34:11.083 "aliases": [ 00:34:11.083 "5018eb22-2d8c-4f0d-8f0a-22eed227a2a6" 00:34:11.083 ], 00:34:11.083 "product_name": "Malloc disk", 00:34:11.083 "block_size": 512, 00:34:11.083 "num_blocks": 65536, 00:34:11.083 "uuid": "5018eb22-2d8c-4f0d-8f0a-22eed227a2a6", 00:34:11.083 "assigned_rate_limits": { 00:34:11.083 "rw_ios_per_sec": 0, 00:34:11.083 "rw_mbytes_per_sec": 0, 00:34:11.083 "r_mbytes_per_sec": 0, 00:34:11.083 "w_mbytes_per_sec": 0 00:34:11.083 }, 00:34:11.083 "claimed": true, 00:34:11.083 "claim_type": "exclusive_write", 00:34:11.083 "zoned": false, 00:34:11.083 "supported_io_types": { 
00:34:11.083 "read": true, 00:34:11.083 "write": true, 00:34:11.083 "unmap": true, 00:34:11.083 "flush": true, 00:34:11.083 "reset": true, 00:34:11.083 "nvme_admin": false, 00:34:11.083 "nvme_io": false, 00:34:11.083 "nvme_io_md": false, 00:34:11.083 "write_zeroes": true, 00:34:11.083 "zcopy": true, 00:34:11.083 "get_zone_info": false, 00:34:11.083 "zone_management": false, 00:34:11.083 "zone_append": false, 00:34:11.083 "compare": false, 00:34:11.083 "compare_and_write": false, 00:34:11.083 "abort": true, 00:34:11.083 "seek_hole": false, 00:34:11.083 "seek_data": false, 00:34:11.083 "copy": true, 00:34:11.083 "nvme_iov_md": false 00:34:11.083 }, 00:34:11.083 "memory_domains": [ 00:34:11.083 { 00:34:11.083 "dma_device_id": "system", 00:34:11.083 "dma_device_type": 1 00:34:11.083 }, 00:34:11.083 { 00:34:11.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:11.083 "dma_device_type": 2 00:34:11.083 } 00:34:11.083 ], 00:34:11.083 "driver_specific": {} 00:34:11.083 } 00:34:11.083 ] 00:34:11.083 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:11.083 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:34:11.083 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:11.083 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:11.083 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:11.083 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:34:11.083 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:34:11.083 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:11.083 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:11.083 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:11.083 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:34:11.083 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:11.083 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:11.083 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:11.083 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:34:11.083 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:11.083 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:11.084 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:11.084 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:34:11.084 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:34:11.084 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:34:11.084 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:11.084 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:11.343 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:11.343 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:34:11.343 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "25af8156-8ab9-553f-9883-6fbd660c765c"' ' ],' ' "product_name": "crypto",' 
' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "25af8156-8ab9-553f-9883-6fbd660c765c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "905845de-edca-521b-8beb-764702aaf2b4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "905845de-edca-521b-8beb-764702aaf2b4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "775aaf36-0b2c-52b2-b1fd-f6466b952111"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "775aaf36-0b2c-52b2-b1fd-f6466b952111",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' 
"driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "da219839-97bc-5489-b4e0-1fca2c911ff4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "da219839-97bc-5489-b4e0-1fca2c911ff4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:11.343 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:34:11.343 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:34:11.343 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:34:11.343 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:34:11.343 22:41:21 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 3618725 00:34:11.343 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 3618725 ']' 00:34:11.343 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 3618725 00:34:11.343 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:34:11.343 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:11.343 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3618725 00:34:11.343 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:11.343 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:11.344 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3618725' 00:34:11.344 killing process with pid 3618725 00:34:11.344 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 3618725 00:34:11.344 22:41:21 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 3618725 00:34:11.910 22:41:22 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:11.910 22:41:22 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:11.910 22:41:22 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:34:11.910 22:41:22 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:11.910 22:41:22 blockdev_crypto_qat -- 
common/autotest_common.sh@10 -- # set +x 00:34:11.910 ************************************ 00:34:11.910 START TEST bdev_hello_world 00:34:11.910 ************************************ 00:34:11.910 22:41:22 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:11.911 [2024-07-12 22:41:22.149924] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:34:11.911 [2024-07-12 22:41:22.150000] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3619326 ] 00:34:12.168 [2024-07-12 22:41:22.276042] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:12.168 [2024-07-12 22:41:22.372988] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:12.168 [2024-07-12 22:41:22.394270] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:12.168 [2024-07-12 22:41:22.402299] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:12.169 [2024-07-12 22:41:22.410326] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:12.427 [2024-07-12 22:41:22.515771] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:15.004 [2024-07-12 22:41:24.719310] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:15.004 [2024-07-12 22:41:24.719383] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:15.004 [2024-07-12 22:41:24.719398] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:15.004 [2024-07-12 22:41:24.727330] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:15.004 [2024-07-12 22:41:24.727349] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:15.004 [2024-07-12 22:41:24.727361] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:15.004 [2024-07-12 22:41:24.735350] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:15.004 [2024-07-12 22:41:24.735367] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:15.004 [2024-07-12 22:41:24.735379] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:15.004 [2024-07-12 22:41:24.743370] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:15.004 [2024-07-12 22:41:24.743388] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:15.004 [2024-07-12 22:41:24.743399] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:15.004 [2024-07-12 22:41:24.818405] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:34:15.004 [2024-07-12 22:41:24.818445] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:34:15.004 [2024-07-12 22:41:24.818464] hello_bdev.c: 244:hello_start: 
*NOTICE*: Opening io channel 00:34:15.004 [2024-07-12 22:41:24.819771] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:34:15.004 [2024-07-12 22:41:24.819845] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:34:15.004 [2024-07-12 22:41:24.819863] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:34:15.004 [2024-07-12 22:41:24.819907] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:34:15.004 00:34:15.004 [2024-07-12 22:41:24.819934] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:34:15.004 00:34:15.004 real 0m3.145s 00:34:15.004 user 0m2.751s 00:34:15.004 sys 0m0.358s 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:34:15.004 ************************************ 00:34:15.004 END TEST bdev_hello_world 00:34:15.004 ************************************ 00:34:15.004 22:41:25 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:15.004 22:41:25 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:34:15.004 22:41:25 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:15.004 22:41:25 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:15.004 22:41:25 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:15.004 ************************************ 00:34:15.004 START TEST bdev_bounds 00:34:15.004 ************************************ 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=3619759 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 3619759' 00:34:15.004 Process bdevio pid: 3619759 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 3619759 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 3619759 ']' 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:15.004 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:15.004 22:41:25 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:15.262 [2024-07-12 22:41:25.379268] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:34:15.262 [2024-07-12 22:41:25.379335] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3619759 ] 00:34:15.262 [2024-07-12 22:41:25.507079] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:15.521 [2024-07-12 22:41:25.615468] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:15.521 [2024-07-12 22:41:25.615553] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:15.521 [2024-07-12 22:41:25.615558] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:15.521 [2024-07-12 22:41:25.636896] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:15.521 [2024-07-12 22:41:25.644923] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:15.521 [2024-07-12 22:41:25.652943] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:15.521 [2024-07-12 22:41:25.773552] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:18.059 [2024-07-12 22:41:27.982247] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:18.059 [2024-07-12 22:41:27.982325] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:18.059 [2024-07-12 22:41:27.982341] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:18.059 [2024-07-12 22:41:27.990264] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:18.059 [2024-07-12 22:41:27.990283] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:18.059 [2024-07-12 22:41:27.990297] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:18.059 [2024-07-12 22:41:27.998282] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:18.059 [2024-07-12 22:41:27.998301] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:18.059 [2024-07-12 22:41:27.998313] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:18.059 [2024-07-12 22:41:28.006306] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:18.059 [2024-07-12 22:41:28.006324] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:18.059 [2024-07-12 22:41:28.006336] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:18.059 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:18.059 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:34:18.059 22:41:28 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:34:18.059 I/O targets: 00:34:18.059 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:34:18.059 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:34:18.059 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:34:18.059 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:34:18.059 
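For reference, the unit-test output that follows is produced by SPDK's standalone bdevio app, driven over RPC exactly as in the trace above; with the workspace paths shortened to the repo root, the invocation boils down to:

  ./test/bdev/bdevio/bdevio -w -s 0 --json ./test/bdev/bdev.json &
  ./test/bdev/bdevio/tests.py perform_tests

The -w flag keeps bdevio waiting for the RPC trigger, which is why the CUnit summary only appears once perform_tests has driven all four suites.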
00:34:18.059 00:34:18.059 CUnit - A unit testing framework for C - Version 2.1-3 00:34:18.059 http://cunit.sourceforge.net/ 00:34:18.059 00:34:18.059 00:34:18.059 Suite: bdevio tests on: crypto_ram3 00:34:18.059 Test: blockdev write read block ...passed 00:34:18.059 Test: blockdev write zeroes read block ...passed 00:34:18.059 Test: blockdev write zeroes read no split ...passed 00:34:18.059 Test: blockdev write zeroes read split ...passed 00:34:18.059 Test: blockdev write zeroes read split partial ...passed 00:34:18.059 Test: blockdev reset ...passed 00:34:18.059 Test: blockdev write read 8 blocks ...passed 00:34:18.059 Test: blockdev write read size > 128k ...passed 00:34:18.059 Test: blockdev write read invalid size ...passed 00:34:18.059 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:18.059 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:18.059 Test: blockdev write read max offset ...passed 00:34:18.059 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:18.059 Test: blockdev writev readv 8 blocks ...passed 00:34:18.059 Test: blockdev writev readv 30 x 1block ...passed 00:34:18.059 Test: blockdev writev readv block ...passed 00:34:18.059 Test: blockdev writev readv size > 128k ...passed 00:34:18.059 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:18.059 Test: blockdev comparev and writev ...passed 00:34:18.059 Test: blockdev nvme passthru rw ...passed 00:34:18.059 Test: blockdev nvme passthru vendor specific ...passed 00:34:18.059 Test: blockdev nvme admin passthru ...passed 00:34:18.059 Test: blockdev copy ...passed 00:34:18.059 Suite: bdevio tests on: crypto_ram2 00:34:18.059 Test: blockdev write read block ...passed 00:34:18.059 Test: blockdev write zeroes read block ...passed 00:34:18.059 Test: blockdev write zeroes read no split ...passed 00:34:18.059 Test: blockdev write zeroes read split ...passed 00:34:18.059 Test: blockdev write zeroes read split partial ...passed 00:34:18.059 Test: blockdev reset ...passed 00:34:18.059 Test: blockdev write read 8 blocks ...passed 00:34:18.059 Test: blockdev write read size > 128k ...passed 00:34:18.059 Test: blockdev write read invalid size ...passed 00:34:18.059 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:18.059 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:18.059 Test: blockdev write read max offset ...passed 00:34:18.059 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:18.059 Test: blockdev writev readv 8 blocks ...passed 00:34:18.059 Test: blockdev writev readv 30 x 1block ...passed 00:34:18.059 Test: blockdev writev readv block ...passed 00:34:18.059 Test: blockdev writev readv size > 128k ...passed 00:34:18.059 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:18.059 Test: blockdev comparev and writev ...passed 00:34:18.059 Test: blockdev nvme passthru rw ...passed 00:34:18.059 Test: blockdev nvme passthru vendor specific ...passed 00:34:18.059 Test: blockdev nvme admin passthru ...passed 00:34:18.059 Test: blockdev copy ...passed 00:34:18.059 Suite: bdevio tests on: crypto_ram1 00:34:18.059 Test: blockdev write read block ...passed 00:34:18.059 Test: blockdev write zeroes read block ...passed 00:34:18.059 Test: blockdev write zeroes read no split ...passed 00:34:18.059 Test: blockdev write zeroes read split ...passed 00:34:18.059 Test: blockdev write zeroes read split partial ...passed 00:34:18.059 Test: blockdev reset 
...passed 00:34:18.059 Test: blockdev write read 8 blocks ...passed 00:34:18.059 Test: blockdev write read size > 128k ...passed 00:34:18.059 Test: blockdev write read invalid size ...passed 00:34:18.059 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:18.059 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:18.059 Test: blockdev write read max offset ...passed 00:34:18.059 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:18.059 Test: blockdev writev readv 8 blocks ...passed 00:34:18.059 Test: blockdev writev readv 30 x 1block ...passed 00:34:18.059 Test: blockdev writev readv block ...passed 00:34:18.059 Test: blockdev writev readv size > 128k ...passed 00:34:18.059 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:18.059 Test: blockdev comparev and writev ...passed 00:34:18.059 Test: blockdev nvme passthru rw ...passed 00:34:18.059 Test: blockdev nvme passthru vendor specific ...passed 00:34:18.059 Test: blockdev nvme admin passthru ...passed 00:34:18.059 Test: blockdev copy ...passed 00:34:18.059 Suite: bdevio tests on: crypto_ram 00:34:18.059 Test: blockdev write read block ...passed 00:34:18.059 Test: blockdev write zeroes read block ...passed 00:34:18.059 Test: blockdev write zeroes read no split ...passed 00:34:18.318 Test: blockdev write zeroes read split ...passed 00:34:18.319 Test: blockdev write zeroes read split partial ...passed 00:34:18.319 Test: blockdev reset ...passed 00:34:18.319 Test: blockdev write read 8 blocks ...passed 00:34:18.319 Test: blockdev write read size > 128k ...passed 00:34:18.319 Test: blockdev write read invalid size ...passed 00:34:18.319 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:18.319 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:18.319 Test: blockdev write read max offset ...passed 00:34:18.319 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:18.319 Test: blockdev writev readv 8 blocks ...passed 00:34:18.319 Test: blockdev writev readv 30 x 1block ...passed 00:34:18.319 Test: blockdev writev readv block ...passed 00:34:18.319 Test: blockdev writev readv size > 128k ...passed 00:34:18.319 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:18.319 Test: blockdev comparev and writev ...passed 00:34:18.319 Test: blockdev nvme passthru rw ...passed 00:34:18.319 Test: blockdev nvme passthru vendor specific ...passed 00:34:18.319 Test: blockdev nvme admin passthru ...passed 00:34:18.319 Test: blockdev copy ...passed 00:34:18.319 00:34:18.319 Run Summary: Type Total Ran Passed Failed Inactive 00:34:18.319 suites 4 4 n/a 0 0 00:34:18.319 tests 92 92 92 0 0 00:34:18.319 asserts 520 520 520 0 n/a 00:34:18.319 00:34:18.319 Elapsed time = 0.514 seconds 00:34:18.319 0 00:34:18.319 22:41:28 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 3619759 00:34:18.319 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 3619759 ']' 00:34:18.319 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 3619759 00:34:18.319 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:34:18.319 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:18.319 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3619759 00:34:18.319 22:41:28 
blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:18.319 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:18.319 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3619759' 00:34:18.319 killing process with pid 3619759 00:34:18.319 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 3619759 00:34:18.319 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 3619759 00:34:18.887 22:41:28 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:34:18.887 00:34:18.887 real 0m3.640s 00:34:18.887 user 0m10.121s 00:34:18.887 sys 0m0.581s 00:34:18.887 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:18.887 22:41:28 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:18.887 ************************************ 00:34:18.887 END TEST bdev_bounds 00:34:18.887 ************************************ 00:34:18.887 22:41:29 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:18.887 22:41:29 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:18.887 22:41:29 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:34:18.887 22:41:29 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:18.887 22:41:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:18.887 ************************************ 00:34:18.887 START TEST bdev_nbd 00:34:18.888 ************************************ 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- 
bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=3620259 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 3620259 /var/tmp/spdk-nbd.sock 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 3620259 ']' 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:34:18.888 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:18.888 22:41:29 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:18.888 [2024-07-12 22:41:29.095257] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:34:18.888 [2024-07-12 22:41:29.095320] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:19.147 [2024-07-12 22:41:29.227340] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:19.147 [2024-07-12 22:41:29.330605] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:19.147 [2024-07-12 22:41:29.351887] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:19.147 [2024-07-12 22:41:29.359909] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:19.147 [2024-07-12 22:41:29.367931] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:19.147 [2024-07-12 22:41:29.466272] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:21.683 [2024-07-12 22:41:31.676673] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:21.683 [2024-07-12 22:41:31.676740] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:21.683 [2024-07-12 22:41:31.676755] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:21.683 [2024-07-12 22:41:31.684692] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:21.683 [2024-07-12 22:41:31.684712] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:21.683 [2024-07-12 22:41:31.684724] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:21.683 [2024-07-12 22:41:31.692712] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:21.683 [2024-07-12 22:41:31.692730] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:21.683 [2024-07-12 22:41:31.692741] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:21.683 [2024-07-12 22:41:31.700734] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:21.683 [2024-07-12 22:41:31.700752] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:21.683 [2024-07-12 22:41:31.700764] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram1 crypto_ram2 crypto_ram3' 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:21.683 22:41:31 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:21.943 1+0 records in 00:34:21.943 1+0 records out 00:34:21.943 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000303878 s, 13.5 MB/s 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:21.943 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 
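The same attach-and-verify pattern repeats for each of the four crypto bdevs in turn (crypto_ram1 continues right below). Condensed from the nbd_common.sh trace, with paths shortened, the per-device check is essentially:

  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
  dd if=/dev/nbd0 of=./test/bdev/nbdtest bs=4096 count=1 iflag=direct
  stat -c %s ./test/bdev/nbdtest    # a 4096-byte result counts as success
  rm -f ./test/bdev/nbdtest
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0

i.e. expose the bdev through the kernel nbd driver, read one 4 KiB block with O_DIRECT, and confirm the copy landed before detaching.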
00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:22.202 1+0 records in 00:34:22.202 1+0 records out 00:34:22.202 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000320947 s, 12.8 MB/s 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:22.202 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:22.462 22:41:32 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:22.462 1+0 records in 00:34:22.462 1+0 records out 00:34:22.462 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000324361 s, 12.6 MB/s 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:22.462 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:22.720 1+0 records in 00:34:22.720 1+0 records out 00:34:22.720 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000354297 s, 11.6 MB/s 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:22.720 22:41:32 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:22.720 22:41:32 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:22.979 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:34:22.980 { 00:34:22.980 "nbd_device": "/dev/nbd0", 00:34:22.980 "bdev_name": "crypto_ram" 00:34:22.980 }, 00:34:22.980 { 00:34:22.980 "nbd_device": "/dev/nbd1", 00:34:22.980 "bdev_name": "crypto_ram1" 00:34:22.980 }, 00:34:22.980 { 00:34:22.980 "nbd_device": "/dev/nbd2", 00:34:22.980 "bdev_name": "crypto_ram2" 00:34:22.980 }, 00:34:22.980 { 00:34:22.980 "nbd_device": "/dev/nbd3", 00:34:22.980 "bdev_name": "crypto_ram3" 00:34:22.980 } 00:34:22.980 ]' 00:34:22.980 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:34:22.980 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:34:22.980 { 00:34:22.980 "nbd_device": "/dev/nbd0", 00:34:22.980 "bdev_name": "crypto_ram" 00:34:22.980 }, 00:34:22.980 { 00:34:22.980 "nbd_device": "/dev/nbd1", 00:34:22.980 "bdev_name": "crypto_ram1" 00:34:22.980 }, 00:34:22.980 { 00:34:22.980 "nbd_device": "/dev/nbd2", 00:34:22.980 "bdev_name": "crypto_ram2" 00:34:22.980 }, 00:34:22.980 { 00:34:22.980 "nbd_device": "/dev/nbd3", 00:34:22.980 "bdev_name": "crypto_ram3" 00:34:22.980 } 00:34:22.980 ]' 00:34:22.980 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:34:22.980 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:34:22.980 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:22.980 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:34:22.980 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:22.980 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:22.980 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:22.980 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:23.239 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:23.239 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:23.239 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:23.239 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:23.239 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:23.239 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:23.239 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:23.239 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:23.239 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:23.239 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:23.498 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:23.498 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:23.498 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:23.498 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:23.498 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:23.498 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:23.498 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:23.498 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:23.498 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:23.498 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:34:23.757 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:34:23.757 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:34:23.757 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:34:23.757 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:23.757 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:23.757 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:34:23.757 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:23.757 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:23.757 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:23.757 22:41:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:34:24.017 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:34:24.017 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:34:24.017 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:34:24.017 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:24.017 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:24.017 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:34:24.017 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:24.017 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:24.017 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:24.017 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:24.017 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:24.276 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:24.277 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:34:24.536 /dev/nbd0 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:24.536 22:41:34 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:24.536 1+0 records in 00:34:24.536 1+0 records out 00:34:24.536 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265642 s, 15.4 MB/s 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:24.536 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:34:24.796 /dev/nbd1 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:24.796 1+0 records in 00:34:24.796 1+0 records out 00:34:24.796 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000282961 s, 14.5 MB/s 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:24.796 22:41:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:34:25.055 /dev/nbd10 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:25.055 1+0 records in 00:34:25.055 1+0 records out 00:34:25.055 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000351581 s, 11.7 MB/s 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:25.055 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:34:25.315 /dev/nbd11 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd11 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:25.315 1+0 records in 00:34:25.315 1+0 records out 00:34:25.315 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00032824 s, 12.5 MB/s 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:25.315 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:34:25.574 { 00:34:25.574 "nbd_device": "/dev/nbd0", 00:34:25.574 "bdev_name": "crypto_ram" 00:34:25.574 }, 00:34:25.574 { 00:34:25.574 "nbd_device": "/dev/nbd1", 00:34:25.574 "bdev_name": "crypto_ram1" 00:34:25.574 }, 00:34:25.574 { 00:34:25.574 "nbd_device": "/dev/nbd10", 00:34:25.574 "bdev_name": "crypto_ram2" 00:34:25.574 }, 00:34:25.574 { 00:34:25.574 "nbd_device": "/dev/nbd11", 00:34:25.574 "bdev_name": "crypto_ram3" 00:34:25.574 } 00:34:25.574 ]' 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:34:25.574 { 00:34:25.574 "nbd_device": "/dev/nbd0", 00:34:25.574 "bdev_name": "crypto_ram" 00:34:25.574 }, 00:34:25.574 { 00:34:25.574 "nbd_device": "/dev/nbd1", 00:34:25.574 "bdev_name": "crypto_ram1" 00:34:25.574 }, 00:34:25.574 { 00:34:25.574 "nbd_device": "/dev/nbd10", 00:34:25.574 "bdev_name": "crypto_ram2" 00:34:25.574 }, 00:34:25.574 { 00:34:25.574 "nbd_device": "/dev/nbd11", 00:34:25.574 "bdev_name": "crypto_ram3" 00:34:25.574 } 00:34:25.574 ]' 00:34:25.574 22:41:35 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:34:25.574 /dev/nbd1 00:34:25.574 /dev/nbd10 00:34:25.574 /dev/nbd11' 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:34:25.574 /dev/nbd1 00:34:25.574 /dev/nbd10 00:34:25.574 /dev/nbd11' 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:34:25.574 256+0 records in 00:34:25.574 256+0 records out 00:34:25.574 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114097 s, 91.9 MB/s 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:25.574 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:34:25.833 256+0 records in 00:34:25.833 256+0 records out 00:34:25.833 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0830797 s, 12.6 MB/s 00:34:25.833 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:25.834 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:34:25.834 256+0 records in 00:34:25.834 256+0 records out 00:34:25.834 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0457883 s, 22.9 MB/s 00:34:25.834 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:25.834 22:41:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:34:25.834 256+0 records in 00:34:25.834 256+0 records out 00:34:25.834 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0572311 s, 18.3 MB/s 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:34:25.834 256+0 records in 00:34:25.834 256+0 records out 00:34:25.834 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0332962 s, 31.5 MB/s 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:25.834 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:26.093 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:26.093 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:26.093 22:41:36 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:26.093 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:26.093 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:26.093 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:26.352 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:26.352 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:26.352 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:26.352 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:26.352 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:26.611 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:26.611 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:26.611 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:26.611 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:26.611 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:26.611 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:26.611 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:26.611 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:26.611 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:34:26.870 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:34:26.870 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:34:26.870 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:34:26.870 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:26.870 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:26.870 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:34:26.870 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:26.870 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:26.870 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:26.870 22:41:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:27.152 22:41:37 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:27.152 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:34:27.410 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:34:27.670 malloc_lvol_verify 00:34:27.670 22:41:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:34:27.929 289c2dc2-48e6-4c9a-a1e9-387920450564 00:34:27.929 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:34:27.929 c7e245e0-6dd3-42a4-95b8-e4f070ee01a0 00:34:28.188 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:34:28.188 /dev/nbd0 00:34:28.448 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 
00:34:28.448 mke2fs 1.46.5 (30-Dec-2021) 00:34:28.448 Discarding device blocks: 0/4096 done 00:34:28.448 Creating filesystem with 4096 1k blocks and 1024 inodes 00:34:28.448 00:34:28.448 Allocating group tables: 0/1 done 00:34:28.448 Writing inode tables: 0/1 done 00:34:28.448 Creating journal (1024 blocks): done 00:34:28.448 Writing superblocks and filesystem accounting information: 0/1 done 00:34:28.448 00:34:28.448 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:34:28.448 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:34:28.448 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:28.448 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:34:28.448 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:28.448 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:28.448 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:28.448 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 3620259 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 3620259 ']' 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 3620259 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3620259 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3620259' 00:34:28.708 killing process with pid 3620259 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 3620259 00:34:28.708 22:41:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 
3620259 00:34:28.967 22:41:39 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:34:28.967 00:34:28.967 real 0m10.264s 00:34:28.967 user 0m13.458s 00:34:28.967 sys 0m4.079s 00:34:28.967 22:41:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:28.967 22:41:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:28.967 ************************************ 00:34:28.967 END TEST bdev_nbd 00:34:28.967 ************************************ 00:34:29.228 22:41:39 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:29.228 22:41:39 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:34:29.228 22:41:39 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:34:29.228 22:41:39 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:34:29.228 22:41:39 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:34:29.228 22:41:39 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:29.228 22:41:39 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:29.228 22:41:39 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:29.228 ************************************ 00:34:29.228 START TEST bdev_fio 00:34:29.228 ************************************ 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:29.228 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:29.228 ************************************ 00:34:29.228 START TEST bdev_fio_rw_verify 00:34:29.228 ************************************ 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:29.228 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:29.229 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:29.496 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:29.496 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:29.496 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:29.496 22:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:29.756 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:29.756 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:29.756 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:29.756 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:29.756 fio-3.35 00:34:29.756 Starting 4 threads 00:34:44.692 00:34:44.692 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=3622199: Fri Jul 12 22:41:52 2024 00:34:44.692 read: IOPS=22.8k, BW=89.2MiB/s (93.5MB/s)(892MiB/10001msec) 00:34:44.692 slat (usec): min=11, max=1306, avg=61.22, stdev=29.90 00:34:44.692 clat (usec): min=20, max=1874, avg=333.84, stdev=199.55 00:34:44.692 lat (usec): min=52, max=1929, avg=395.06, stdev=213.64 00:34:44.692 clat percentiles (usec): 00:34:44.692 | 50.000th=[ 285], 99.000th=[ 930], 99.900th=[ 1172], 99.990th=[ 1287], 00:34:44.692 | 99.999th=[ 1663] 00:34:44.692 write: IOPS=25.2k, BW=98.3MiB/s (103MB/s)(957MiB/9734msec); 0 zone resets 00:34:44.692 slat (usec): min=18, max=365, avg=71.91, stdev=30.67 00:34:44.692 clat (usec): min=24, max=2639, avg=370.17, stdev=214.96 00:34:44.692 lat (usec): min=75, max=2982, avg=442.08, stdev=230.36 00:34:44.692 clat percentiles (usec): 00:34:44.692 | 50.000th=[ 326], 99.000th=[ 1057], 99.900th=[ 1319], 99.990th=[ 1434], 00:34:44.692 | 99.999th=[ 2008] 00:34:44.692 bw ( KiB/s): min=76552, max=137667, per=97.82%, avg=98485.74, stdev=3813.78, samples=76 00:34:44.692 iops : min=19138, max=34416, avg=24621.37, stdev=953.41, samples=76 00:34:44.692 lat (usec) : 50=0.01%, 100=3.53%, 250=34.56%, 500=39.88%, 750=16.61% 00:34:44.692 lat (usec) : 1000=4.40% 00:34:44.692 lat (msec) : 2=1.01%, 4=0.01% 00:34:44.692 cpu : usr=99.62%, sys=0.00%, ctx=102, majf=0, minf=278 00:34:44.692 IO depths : 1=4.2%, 2=27.4%, 4=54.7%, 8=13.7%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:44.692 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:44.692 complete : 0=0.0%, 4=88.0%, 8=12.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:44.692 issued rwts: total=228350,245006,0,0 short=0,0,0,0 dropped=0,0,0,0 00:34:44.692 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:44.692 00:34:44.692 Run status group 0 (all jobs): 00:34:44.692 READ: bw=89.2MiB/s (93.5MB/s), 89.2MiB/s-89.2MiB/s (93.5MB/s-93.5MB/s), io=892MiB (935MB), run=10001-10001msec 00:34:44.692 WRITE: bw=98.3MiB/s (103MB/s), 98.3MiB/s-98.3MiB/s (103MB/s-103MB/s), io=957MiB (1004MB), run=9734-9734msec 00:34:44.692 00:34:44.692 real 0m13.553s 00:34:44.692 user 0m45.628s 00:34:44.692 sys 0m0.508s 00:34:44.692 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:44.692 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:34:44.692 ************************************ 00:34:44.692 END TEST bdev_fio_rw_verify 00:34:44.692 ************************************ 00:34:44.692 22:41:53 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1142 -- # return 0 00:34:44.692 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:34:44.692 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:44.692 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:34:44.692 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:44.692 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:34:44.692 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:34:44.692 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:44.692 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "25af8156-8ab9-553f-9883-6fbd660c765c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "25af8156-8ab9-553f-9883-6fbd660c765c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram1",' ' "aliases": [' ' "905845de-edca-521b-8beb-764702aaf2b4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "905845de-edca-521b-8beb-764702aaf2b4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "775aaf36-0b2c-52b2-b1fd-f6466b952111"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "775aaf36-0b2c-52b2-b1fd-f6466b952111",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "da219839-97bc-5489-b4e0-1fca2c911ff4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "da219839-97bc-5489-b4e0-1fca2c911ff4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:34:44.693 crypto_ram1 00:34:44.693 crypto_ram2 00:34:44.693 crypto_ram3 ]] 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "25af8156-8ab9-553f-9883-6fbd660c765c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "25af8156-8ab9-553f-9883-6fbd660c765c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "905845de-edca-521b-8beb-764702aaf2b4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "905845de-edca-521b-8beb-764702aaf2b4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "775aaf36-0b2c-52b2-b1fd-f6466b952111"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "775aaf36-0b2c-52b2-b1fd-f6466b952111",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' 
' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "da219839-97bc-5489-b4e0-1fca2c911ff4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "da219839-97bc-5489-b4e0-1fca2c911ff4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:44.693 
22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:44.693 22:41:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:44.694 ************************************ 00:34:44.694 START TEST bdev_fio_trim 00:34:44.694 ************************************ 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:44.694 22:41:53 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:44.694 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:44.694 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:44.694 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:44.694 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:44.694 fio-3.35 00:34:44.694 Starting 4 threads 00:34:56.900 00:34:56.900 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=3624051: Fri Jul 12 22:42:06 2024 00:34:56.900 write: IOPS=32.6k, BW=127MiB/s (134MB/s)(1273MiB/10001msec); 0 zone resets 00:34:56.900 slat (usec): min=17, max=435, avg=72.47, stdev=27.40 00:34:56.900 clat (usec): min=31, max=1635, avg=254.99, stdev=141.82 00:34:56.900 lat (usec): min=55, max=1767, avg=327.46, stdev=153.81 00:34:56.900 clat percentiles (usec): 00:34:56.900 | 50.000th=[ 233], 99.000th=[ 660], 99.900th=[ 742], 99.990th=[ 807], 00:34:56.900 | 99.999th=[ 1057] 00:34:56.900 bw ( KiB/s): min=111008, max=208960, per=100.00%, avg=130955.79, stdev=7916.05, samples=76 00:34:56.900 iops : min=27752, max=52240, avg=32738.95, stdev=1979.01, samples=76 00:34:56.900 trim: IOPS=32.6k, BW=127MiB/s (134MB/s)(1273MiB/10001msec); 0 zone resets 00:34:56.900 slat (usec): min=5, max=1202, avg=20.62, stdev=11.24 00:34:56.900 clat (usec): min=56, max=1767, avg=327.65, stdev=153.83 00:34:56.900 lat (usec): min=66, max=1801, avg=348.27, stdev=159.47 00:34:56.900 clat percentiles (usec): 00:34:56.900 | 50.000th=[ 306], 99.000th=[ 758], 99.900th=[ 848], 99.990th=[ 947], 00:34:56.900 | 99.999th=[ 1221] 00:34:56.900 bw ( KiB/s): min=111008, max=208960, per=100.00%, avg=130955.79, stdev=7916.05, samples=76 00:34:56.900 iops : min=27752, max=52240, avg=32738.95, stdev=1979.01, samples=76 00:34:56.900 lat (usec) : 50=0.06%, 100=6.86%, 250=38.81%, 500=43.89%, 750=9.77% 00:34:56.900 lat (usec) : 1000=0.60% 
00:34:56.900 lat (msec) : 2=0.01% 00:34:56.900 cpu : usr=99.60%, sys=0.00%, ctx=42, majf=0, minf=119 00:34:56.900 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:56.900 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:56.900 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:56.900 issued rwts: total=0,325962,325962,0 short=0,0,0,0 dropped=0,0,0,0 00:34:56.900 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:56.900 00:34:56.900 Run status group 0 (all jobs): 00:34:56.900 WRITE: bw=127MiB/s (134MB/s), 127MiB/s-127MiB/s (134MB/s-134MB/s), io=1273MiB (1335MB), run=10001-10001msec 00:34:56.900 TRIM: bw=127MiB/s (134MB/s), 127MiB/s-127MiB/s (134MB/s-134MB/s), io=1273MiB (1335MB), run=10001-10001msec 00:34:56.900 00:34:56.900 real 0m13.478s 00:34:56.900 user 0m45.827s 00:34:56.900 sys 0m0.496s 00:34:56.900 22:42:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:56.900 22:42:06 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:34:56.900 ************************************ 00:34:56.900 END TEST bdev_fio_trim 00:34:56.900 ************************************ 00:34:56.900 22:42:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:56.900 22:42:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:34:56.900 22:42:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:56.900 22:42:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:34:56.900 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:56.900 22:42:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:34:56.900 00:34:56.900 real 0m27.397s 00:34:56.900 user 1m31.623s 00:34:56.900 sys 0m1.226s 00:34:56.900 22:42:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:56.900 22:42:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:56.900 ************************************ 00:34:56.900 END TEST bdev_fio 00:34:56.900 ************************************ 00:34:56.900 22:42:06 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:56.900 22:42:06 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:56.900 22:42:06 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:56.900 22:42:06 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:56.900 22:42:06 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:56.900 22:42:06 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:56.900 ************************************ 00:34:56.900 START TEST bdev_verify 00:34:56.900 ************************************ 00:34:56.900 22:42:06 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:56.900 [2024-07-12 22:42:06.914797] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 
24.03.0 initialization... 00:34:56.900 [2024-07-12 22:42:06.914856] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3625976 ] 00:34:56.900 [2024-07-12 22:42:07.041627] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:56.900 [2024-07-12 22:42:07.146593] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:56.900 [2024-07-12 22:42:07.146598] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:56.900 [2024-07-12 22:42:07.167962] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:56.900 [2024-07-12 22:42:07.175987] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:56.900 [2024-07-12 22:42:07.184017] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:57.159 [2024-07-12 22:42:07.289125] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:59.693 [2024-07-12 22:42:09.486781] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:59.693 [2024-07-12 22:42:09.486872] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:59.693 [2024-07-12 22:42:09.486888] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:59.693 [2024-07-12 22:42:09.494798] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:59.693 [2024-07-12 22:42:09.494817] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:59.693 [2024-07-12 22:42:09.494830] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:59.693 [2024-07-12 22:42:09.502819] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:59.693 [2024-07-12 22:42:09.502837] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:59.693 [2024-07-12 22:42:09.502848] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:59.693 [2024-07-12 22:42:09.510841] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:59.693 [2024-07-12 22:42:09.510858] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:59.693 [2024-07-12 22:42:09.510869] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:59.693 Running I/O for 5 seconds... 
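(For reference: the crypto stack that the verify job above exercises is declared in the bdev.json passed to bdevperf via --json, and to fio_bdev via --spdk_json_conf. The sketch below is illustrative only -- the real file is generated by the test scripts under test/bdev, the DEK is a placeholder, and the exact parameter names should be double-checked against scripts/rpc.py on this SPDK revision -- but it shows the rough shape of the config behind the "Found key ..." / "vbdev creation deferred" notices: one malloc base bdev per crypto vbdev, keyed by a named accel crypto key.)

# Illustrative sketch only: placeholder key material; parameter names are from
# memory of recent SPDK releases -- verify against scripts/rpc.py before use.
cat > bdev.json <<'EOF'
{
  "subsystems": [
    { "subsystem": "accel",
      "config": [
        { "method": "accel_crypto_key_create",
          "params": { "name": "test_dek_qat_cbc", "cipher": "AES_CBC",
                      "key": "00112233445566778899aabbccddeeff" } }
      ] },
    { "subsystem": "bdev",
      "config": [
        { "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 131072, "block_size": 512 } },
        { "method": "bdev_crypto_create",
          "params": { "base_bdev_name": "Malloc0", "name": "crypto_ram",
                      "key_name": "test_dek_qat_cbc" } }
      ] }
  ]
}
EOF

(The remaining crypto_ram1/2/3 vbdevs named in the notices above would follow the same pattern over Malloc1..Malloc3 with the XTS and second CBC/XTS keys; the fio job file used by bdev_fio then simply points filename= at these vbdev names.)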
00:35:04.963 00:35:04.963 Latency(us) 00:35:04.963 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:04.963 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:04.963 Verification LBA range: start 0x0 length 0x1000 00:35:04.963 crypto_ram : 5.07 494.28 1.93 0.00 0.00 257759.94 3148.58 164124.94 00:35:04.963 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:04.963 Verification LBA range: start 0x1000 length 0x1000 00:35:04.963 crypto_ram : 5.05 505.32 1.97 0.00 0.00 251573.82 4701.50 147712.45 00:35:04.963 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:04.963 Verification LBA range: start 0x0 length 0x1000 00:35:04.963 crypto_ram1 : 5.07 497.26 1.94 0.00 0.00 255734.57 3390.78 147712.45 00:35:04.963 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:04.963 Verification LBA range: start 0x1000 length 0x1000 00:35:04.963 crypto_ram1 : 5.05 506.73 1.98 0.00 0.00 250131.53 5242.88 147712.45 00:35:04.963 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:04.963 Verification LBA range: start 0x0 length 0x1000 00:35:04.963 crypto_ram2 : 5.05 3853.10 15.05 0.00 0.00 32922.52 6097.70 27240.18 00:35:04.963 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:04.963 Verification LBA range: start 0x1000 length 0x1000 00:35:04.963 crypto_ram2 : 5.02 3871.94 15.12 0.00 0.00 32969.82 6297.15 51972.90 00:35:04.963 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:04.963 Verification LBA range: start 0x0 length 0x1000 00:35:04.963 crypto_ram3 : 5.06 3861.30 15.08 0.00 0.00 32783.97 1453.19 26898.25 00:35:04.963 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:04.963 Verification LBA range: start 0x1000 length 0x1000 00:35:04.963 crypto_ram3 : 5.03 3870.71 15.12 0.00 0.00 32907.72 6382.64 45590.26 00:35:04.963 =================================================================================================================== 00:35:04.963 Total : 17460.63 68.21 0.00 0.00 58329.91 1453.19 164124.94 00:35:04.963 00:35:04.963 real 0m8.267s 00:35:04.963 user 0m15.668s 00:35:04.963 sys 0m0.380s 00:35:04.963 22:42:15 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:04.963 22:42:15 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:35:04.963 ************************************ 00:35:04.963 END TEST bdev_verify 00:35:04.963 ************************************ 00:35:04.963 22:42:15 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:35:04.963 22:42:15 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:04.963 22:42:15 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:35:04.963 22:42:15 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:04.963 22:42:15 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:04.963 ************************************ 00:35:04.963 START TEST bdev_verify_big_io 00:35:04.963 ************************************ 00:35:04.963 22:42:15 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:04.963 [2024-07-12 22:42:15.244361] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:35:04.963 [2024-07-12 22:42:15.244420] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3627034 ] 00:35:05.222 [2024-07-12 22:42:15.371931] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:05.222 [2024-07-12 22:42:15.475327] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:05.222 [2024-07-12 22:42:15.475332] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:05.222 [2024-07-12 22:42:15.496704] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:05.222 [2024-07-12 22:42:15.504736] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:05.222 [2024-07-12 22:42:15.512779] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:05.482 [2024-07-12 22:42:15.616685] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:08.018 [2024-07-12 22:42:17.825776] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:08.018 [2024-07-12 22:42:17.825859] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:08.018 [2024-07-12 22:42:17.825875] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:08.018 [2024-07-12 22:42:17.833793] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:08.018 [2024-07-12 22:42:17.833813] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:08.018 [2024-07-12 22:42:17.833825] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:08.018 [2024-07-12 22:42:17.841817] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:08.018 [2024-07-12 22:42:17.841836] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:08.018 [2024-07-12 22:42:17.841848] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:08.018 [2024-07-12 22:42:17.849840] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:08.018 [2024-07-12 22:42:17.849860] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:08.018 [2024-07-12 22:42:17.849872] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:08.018 Running I/O for 5 seconds... 00:35:08.587 [2024-07-12 22:42:18.729465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.729906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.587 [2024-07-12 22:42:18.730287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.730670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.730752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.730824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.730878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.730919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.731316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.731333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.731348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.731363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.734865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.734915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.734977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.735020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.735436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.735492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.587 [2024-07-12 22:42:18.735537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.735579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.736023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.736053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.736070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.736085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.739377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.739442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.588 [2024-07-12 22:42:18.739505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.739559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.739973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.740017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.740058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.740098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.740502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.740519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.740534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.740550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.743902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.743962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.744004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.744048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.744493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.744537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.744578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.744620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.745064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.745082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.745097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.745112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.748329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.748375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.588 [2024-07-12 22:42:18.748431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.748472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.748960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.749004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.749046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.749089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.749516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.749533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.749549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.749565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.752833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.752880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.752932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.752973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.753431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.753475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.753522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.753563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.754000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.754019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.754035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.754050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.757233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.757281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.588 [2024-07-12 22:42:18.757324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.757365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.757821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.757865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.757906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.757958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.758364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.758382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.758397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.758411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.761724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.761772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.761814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.761857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.762321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.762366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.762409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.762451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.762823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.762840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.762855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.762870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.766109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.766166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.588 [2024-07-12 22:42:18.766208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.766249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.766702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.766746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.766788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.766841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.767288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.767307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.767323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.767338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.770541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.770589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.770634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.770677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.771115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.771159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.771200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.771241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.771680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.771698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.588 [2024-07-12 22:42:18.771714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.771730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.774850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.774898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.589 [2024-07-12 22:42:18.774948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.774994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.775451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.775494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.775538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.775586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.776028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.776049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.776065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.776080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.779201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.779249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.779304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.779357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.779862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.779906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.779957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.780001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.780436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.780453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.780469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.780484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.783583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.783629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.589 [2024-07-12 22:42:18.783670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.783711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.784188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.784233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.784277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.784320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.784743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.784760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.784776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.784791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.787760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.787806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.787856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.787900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.788371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.788416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.788458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.788501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.788892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.788908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.788923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.788944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.791897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.791952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.589 [2024-07-12 22:42:18.791994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.792037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.792496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.792541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.792584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.792626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.793003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.793021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.793036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.793051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.796153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.796200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.796243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.796285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.796741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.796785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.796827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.796884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.797283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.797303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.797319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.797333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.800447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.800494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.589 [2024-07-12 22:42:18.800537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.800579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.801042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.801092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.801157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.801211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.801602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.801619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.801634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.801648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.804864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.804911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.804964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.805005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.805401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.805445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.805488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.805530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.805884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.805901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.805916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.805938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.809140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.589 [2024-07-12 22:42:18.809187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.589 [2024-07-12 22:42:18.809237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.809294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.809692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.809751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.809793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.809837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.810316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.810333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.810348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.810362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.813555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.813614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.813668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.813722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.814176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.814244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.814296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.814338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.814744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.814761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.814776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.814790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.817707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:08.590 [2024-07-12 22:42:18.817754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:08.590 [2024-07-12 22:42:18.817796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:08.590 [2024-07-12 22:42:18.817837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:08.590 [2024-07-12 22:42:18.818248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:08.590 [2024-07-12 22:42:18.818293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! entry repeats for timestamps 2024-07-12 22:42:18.818335 through 2024-07-12 22:42:19.193003 (Jenkins elapsed time 00:35:08.590 - 00:35:09.121) ...]
00:35:09.121 [2024-07-12 22:42:19.195773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:09.121 [2024-07-12 22:42:19.196181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:09.121 [2024-07-12 22:42:19.196588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.196629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.197452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.197843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.198243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.198649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.199038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.199056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.199071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.199085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.201867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.202279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.202673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.203066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.203112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.203571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.203980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.204379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.204787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.205190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.205588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.205605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.205622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.205647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.121 [2024-07-12 22:42:19.207986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.208032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.208073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.208114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.208543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.208595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.208639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.208685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.208727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.209141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.209158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.209172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.209187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.211438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.211484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.211526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.121 [2024-07-12 22:42:19.211568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.212004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.212056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.212102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.212145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.212186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.212647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.212664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.122 [2024-07-12 22:42:19.212681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.212695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.214985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.215031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.215073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.215115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.215515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.215566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.215609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.215650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.215692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.216116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.216133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.216148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.216164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.218454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.218501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.218543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.218585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.218997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.219050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.219093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.219135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.219178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.122 [2024-07-12 22:42:19.219628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.219647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.219663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.219678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.221940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.221986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.222040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.222083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.222554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.222611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.222654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.222698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.222746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.223174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.223192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.223206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.223221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.225554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.225599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.225640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.225681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.226103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.226156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.226199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.122 [2024-07-12 22:42:19.226241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.226284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.226642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.226659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.226673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.226688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.228985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.229031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.229073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.229115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.229552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.229604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.229648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.229691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.229732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.230118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.230135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.230150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.230165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.232500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.232546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.232592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.232634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.233069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.122 [2024-07-12 22:42:19.233146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.233195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.233239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.233293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.233681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.233698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.233712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.233727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.236124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.236171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.236213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.236254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.236645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.236712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.236768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.236810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.236852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.122 [2024-07-12 22:42:19.237220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.237237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.237251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.237265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.239619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.239665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.239709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.123 [2024-07-12 22:42:19.239751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.240112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.240175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.240219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.240260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.240324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.240722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.240739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.240754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.240769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.243320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.243376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.243429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.243472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.243869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.243949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.243993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.244068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.244122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.244508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.244525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.244540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.244556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.247038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.123 [2024-07-12 22:42:19.247102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.247171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.247215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.247639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.247703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.247758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.247800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.247842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.248221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.248238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.248253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.248268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.250614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.250660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.250701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.250743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.251102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.251170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.251214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.251255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.251297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.251757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.251775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.251790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.123 [2024-07-12 22:42:19.251804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.254145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.254191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.254245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.254286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.254644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.254706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.254759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.254800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.254841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.255277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.255294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.255309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.255324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.257634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.257712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.257768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.257822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.258188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.258240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.258283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.258324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.258366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.258795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.123 [2024-07-12 22:42:19.258812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.258827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.258841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.261064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.261109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.261155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.261197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.261603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.261654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.261698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.261740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.261784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.262229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.262247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.262262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.262277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.264546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.264592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.264648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.264690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.265178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.265240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.123 [2024-07-12 22:42:19.265284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.265327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.124 [2024-07-12 22:42:19.265369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.265787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.265803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.265818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.265833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.268078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.268124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.268164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.268205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.268633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.268687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.268730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.268771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.268815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.269215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.269232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.269247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.269261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.271502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.271549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.271592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.271633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.272064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.272117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.124 [2024-07-12 22:42:19.272162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.272206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.272249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.272675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.272696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.272712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.272728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.274982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.275029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.275074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.275117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.275554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.275619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.275662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.275703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.275745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.276198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.276216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.276231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.276247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.278584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.278629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.278671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.278714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.124 [2024-07-12 22:42:19.279129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.279185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.279227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.279268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.279311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.279745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.279762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.279777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.279792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.282005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.282056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.282098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.282140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.282591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.282644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.282687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.282729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.282771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.283219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.283238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.283253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.283268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.285685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.124 [2024-07-12 22:42:19.285731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.124 [2024-07-12 22:42:19.285772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:09.124 [... same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously with increasing timestamps from 22:42:19.285813 through 22:42:19.625610 ...]
00:35:09.394 [2024-07-12 22:42:19.625625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:09.394 [2024-07-12 22:42:19.625639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.394 [2024-07-12 22:42:19.628257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.394 [2024-07-12 22:42:19.628657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.394 [2024-07-12 22:42:19.629059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.394 [2024-07-12 22:42:19.629451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.629879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.630281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.630670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.631067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.631463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.631825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.631841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.631856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.631870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.634543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.634950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.634998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.635386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.635758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.636161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.636554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.636954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.637345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.637767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.395 [2024-07-12 22:42:19.637784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.637799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.637814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.640461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.640858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.641252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.641298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.641713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.642122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.642527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.642918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.643314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.643676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.643693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.643708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.643722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.646012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.646057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.646098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.646139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.646576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.646629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.646675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.646717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.395 [2024-07-12 22:42:19.646759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.647169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.647186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.647201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.647215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.649517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.649563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.649605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.649647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.650094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.650147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.650190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.650231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.650281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.650761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.650783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.650798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.650813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.653040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.653086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.653128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.653170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.653584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.653652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.395 [2024-07-12 22:42:19.653696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.653739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.653779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.654220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.654238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.654254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.654269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.656600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.656646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.656688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.656731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.657098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.657151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.657193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.657234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.657275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.657711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.657728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.657744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.657761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.660031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.660076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.660122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.660164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.395 [2024-07-12 22:42:19.660579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.660633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.660676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.660718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.660761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.661194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.395 [2024-07-12 22:42:19.661211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.661226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.661241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.663572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.663618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.663674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.663716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.664193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.664246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.664289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.664330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.664372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.664799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.664815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.664831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.664846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.667244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.667290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.396 [2024-07-12 22:42:19.667332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.667372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.667793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.667846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.667898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.667945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.667988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.668387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.668403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.668417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.668432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.670694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.670739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.670784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.670824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.671264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.671318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.671361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.671402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.671445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.671796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.671812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.671827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.671842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.396 [2024-07-12 22:42:19.674150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.674209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.674252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.674293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.674724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.674780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.674823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.674878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.674920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.675284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.675300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.675319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.675333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.677648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.677694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.677739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.677781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.678190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.678252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.678309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.678363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.678416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.678838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.678855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.396 [2024-07-12 22:42:19.678871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.678885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.681305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.681352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.681394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.681438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.681810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.681872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.681914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.681961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.682005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.682337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.682354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.682369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.682384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.684715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.684761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.684810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.684853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.396 [2024-07-12 22:42:19.685251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.685315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.685358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.685408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.685462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.397 [2024-07-12 22:42:19.685815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.685832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.685846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.685861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.688566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.688621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.688663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.688717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.689093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.689159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.689204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.689264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.689324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.689723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.689739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.689753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.689767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.692227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.692285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.692339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.692384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.692776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.692838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.692894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.397 [2024-07-12 22:42:19.692945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.692988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.693359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.693375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.693391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.693406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.695607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.695654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.695695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.695737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.696086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.696149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.696193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.696234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.696276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.696738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.696754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.696770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.696785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.699097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.699142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.699211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.699254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.699597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.397 [2024-07-12 22:42:19.699660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.699715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.699757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.699797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.700249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.700267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.700287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.700304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.702583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.702649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.702706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.702760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.703128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.703184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.703226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.703267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.703309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.703726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.703744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.703759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.703774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.706056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.706103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.706145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.397 [2024-07-12 22:42:19.706185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.706626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.706678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.706720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.706761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.706803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.707230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.707248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.707265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.707280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.709446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.709492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.709533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.709593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.710048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.710102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.710145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.710187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.397 [2024-07-12 22:42:19.710231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.398 [2024-07-12 22:42:19.710662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.398 [2024-07-12 22:42:19.710679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.398 [2024-07-12 22:42:19.710693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.398 [2024-07-12 22:42:19.710707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.713036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.661 [2024-07-12 22:42:19.713085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.713129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.713170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.713605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.713658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.713702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.713744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.713786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.714170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.714186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.714202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.714217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.716331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.716381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.716422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.716463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.716728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.716792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.716836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.716877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.716922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.717349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.717366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.661 [2024-07-12 22:42:19.717381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.661 [2024-07-12 22:42:19.717395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:09.928 [2024-07-12 22:42:20.058879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:09.928 [2024-07-12 22:42:20.059295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.059312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.059327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.059341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.062196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.062594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.062991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.063383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.063770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.064186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.064580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.064987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.065380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.065819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.928 [2024-07-12 22:42:20.065835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.065851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.065866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.068616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.069025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.069423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.069822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.070280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.070804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.071352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.929 [2024-07-12 22:42:20.071981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.072535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.073117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.073147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.073213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.073291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.076779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.077194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.077246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.077636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.078048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.078455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.078861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.079260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.079650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.080107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.080125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.080142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.080162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.082774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.083178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.083572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.083621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.083981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.929 [2024-07-12 22:42:20.084387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.084776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.085174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.085569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.085960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.085978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.085994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.086009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.088406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.088456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.088498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.088540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.088992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.089066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.089123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.089166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.089208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.089551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.089569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.089585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.089600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.092136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.092183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.092237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.929 [2024-07-12 22:42:20.092286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.092713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.092778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.092839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.092898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.092960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.093366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.093383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.093398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.093414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.095783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.095831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.095874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.095916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.096280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.096344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.096388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.096445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.096487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.096963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.096981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.096997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.097012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.099302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.929 [2024-07-12 22:42:20.099364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.099407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.099450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.099858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.099920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.099969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.100011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.100061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.100500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.100523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.100539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.100554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.102906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.102961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.103003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.103045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.103487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.103545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.103588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.103633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.103675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.104104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.104121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.104136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.929 [2024-07-12 22:42:20.104152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.106393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.106440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.106482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.106524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.106960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.107013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.107070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.107122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.107163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.107609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.107626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.107645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.107661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.110079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.110126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.110171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.110214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.110631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.110686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.110728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.110771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.110812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.111257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.929 [2024-07-12 22:42:20.111275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.111290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.111305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.113725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.113782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.113826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.113867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.114335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.114389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.114433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.114474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.114515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.114863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.114881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.114896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.114911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.117344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.117394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.117437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.117479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.117898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.117977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.118037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.929 [2024-07-12 22:42:20.118079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.930 [2024-07-12 22:42:20.118120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.118513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.118530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.118544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.118559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.120900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.120956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.120999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.121040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.121386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.121448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.121490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.121547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.121588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.122007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.122024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.122038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.122054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.124507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.124566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.124608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.124649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.124992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.125056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.930 [2024-07-12 22:42:20.125100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.125142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.125182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.125617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.125634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.125650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.125665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.128024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.128083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.128128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.128183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.128580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.128642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.128685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.128725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.128766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.129201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.129219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.129234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.129253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.131574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.131622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.131664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.131718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.930 [2024-07-12 22:42:20.132186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.132253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.132296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.132337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.132379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.132782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.132798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.132813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.132828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.135245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.135292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.135333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.135373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.135811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.135863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.135906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.135957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.135999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.136431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.136448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.136464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.136479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.138810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.138858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.930 [2024-07-12 22:42:20.138901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.138951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.139319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.139381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.139422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.139464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.139506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.139966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.139984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.139999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.140014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.141772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.141818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.141878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.141940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.142452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.142510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.142553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.142594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.142635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.143045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.143062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.143077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.143092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.930 [2024-07-12 22:42:20.145503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.145549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.145591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.145633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.146085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.146141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.146183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.146225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.146269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.146736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.146753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.146768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.146784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.148476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.148521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.148575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.148617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.148886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.148953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.148997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.149038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.149080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.149356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.149373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.930 [2024-07-12 22:42:20.149387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.149401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.151109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.151157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.151198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.151238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.151669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.151721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.151764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.151806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.151848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.152235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.152253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.152267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.930 [2024-07-12 22:42:20.152282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.931 [2024-07-12 22:42:20.154162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.931 [2024-07-12 22:42:20.154208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.931 [2024-07-12 22:42:20.154252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.931 [2024-07-12 22:42:20.154293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.931 [2024-07-12 22:42:20.154564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.931 [2024-07-12 22:42:20.154624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.931 [2024-07-12 22:42:20.154665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.931 [2024-07-12 22:42:20.154705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:09.931 [2024-07-12 22:42:20.154749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:09.931 [2024-07-12 22:42:20.155109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeat for each allocation attempt between 22:42:20.155109 and 22:42:20.475879; duplicate log lines collapsed ...]
00:35:10.195 [2024-07-12 22:42:20.475879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:10.195 [2024-07-12 22:42:20.476272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.476667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.477068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.477516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.477533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.477548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.477563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.480336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.480744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.481151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.481544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.481904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.482313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.482707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.483114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.483513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.483974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.483992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.484008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.195 [2024-07-12 22:42:20.484023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.486604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.487011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.487405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.487799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.196 [2024-07-12 22:42:20.488183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.488588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.488989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.489382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.489774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.490215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.490234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.490250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.490265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.492860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.493261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.493654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.494053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.494326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.495158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.495552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.496843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.497397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.497838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.497856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.497871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.497887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.500299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.501773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.196 [2024-07-12 22:42:20.501822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.502226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.502667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.503081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.503489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.505162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.505559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.506004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.506022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.506037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.506052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.508566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.510082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.510477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.510522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.510917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.512463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.512856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.513257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.513660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.514046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.514063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.514077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.196 [2024-07-12 22:42:20.514092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.457 [2024-07-12 22:42:20.516144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.516192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.516233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.516273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.516720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.516771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.516814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.516856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.516898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.517258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.517282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.517297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.517311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.519555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.519605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.519646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.519686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.519959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.520021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.520064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.520105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.520146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.520565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.520583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.457 [2024-07-12 22:42:20.520599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.520614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.522627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.522673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.522714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.522755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.523195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.523247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.523289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.523330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.523384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.523652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.523669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.523683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.523697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.526244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.526292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.526333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.526373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.526651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.526711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.526753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.526794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.526835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.457 [2024-07-12 22:42:20.527268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.457 [2024-07-12 22:42:20.527286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.527301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.527317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.529409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.529455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.529496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.529550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.529940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.530006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.530063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.530104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.530144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.530418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.530435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.530449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.530464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.532762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.532808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.532850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.532892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.533307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.533369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.533416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.458 [2024-07-12 22:42:20.533460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.533501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.533864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.533881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.533896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.533910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.535893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.535945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.535999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.536042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.536485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.536536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.536579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.536621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.536663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.537064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.537082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.537097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.537111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.539300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.539345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.539390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.539432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.539702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.458 [2024-07-12 22:42:20.539756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.539797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.539838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.539885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.540308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.540331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.540346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.540361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.542455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.542512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.542553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.542593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.543019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.543076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.543118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.543160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.543201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.543522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.543539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.543553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.543567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.545799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.545845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.545887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.458 [2024-07-12 22:42:20.545933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.546205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.546259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.546300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.546348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.546400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.546844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.546861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.546878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.546894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.549064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.549132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.549194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.549244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.549698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.549765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.549818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.549860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.549901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.550202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.550218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.550233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.550247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.552384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.458 [2024-07-12 22:42:20.552431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.552476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.552522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.552967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.553019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.458 [2024-07-12 22:42:20.553073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.553114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.553169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.553439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.553456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.553470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.553485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.555349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.555395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.555435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.555476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.555905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.555964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.556008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.556054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.556095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.556361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.556377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.556391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.459 [2024-07-12 22:42:20.556406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.558837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.558901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.558951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.558993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.559441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.559492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.559535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.559577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.559618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.559981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.559999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.560013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.560028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.562379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.562425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.562467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.562509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.562940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.562992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.563035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.563077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.563135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.563585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.459 [2024-07-12 22:42:20.563604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.563624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.563639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.565407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.565456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.565499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.565540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.565811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.565870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.565912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.565959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.566001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.566272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.566289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.566303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.566318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.570167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.570219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.570259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.570300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.570724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.570780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.570822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.570868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.459 [2024-07-12 22:42:20.570909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.571187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.571203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.571218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.571232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.574877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.574932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.574974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.575019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.575283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.575342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.575384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.575424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.575465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.575843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.575860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.575875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.575890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.578755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.578815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.578855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.578895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.579167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.459 [2024-07-12 22:42:20.579227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.459 [2024-07-12 22:42:20.579270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:10.459 (the same *ERROR* from accel_dpdk_cryptodev.c:468 accel_dpdk_cryptodev_task_alloc_resources, "Failed to get src_mbufs!", repeats several hundred times between 22:42:20.579 and 22:42:20.957)
00:35:10.726 [2024-07-12 22:42:20.957309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:10.726 [2024-07-12 22:42:20.959351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.726 [2024-07-12 22:42:20.959398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.726 [2024-07-12 22:42:20.959440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.726 [2024-07-12 22:42:20.959480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.726 [2024-07-12 22:42:20.959750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.726 [2024-07-12 22:42:20.959809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.726 [2024-07-12 22:42:20.959850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.726 [2024-07-12 22:42:20.959891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.726 [2024-07-12 22:42:20.959937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.726 [2024-07-12 22:42:20.960286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.726 [2024-07-12 22:42:20.960304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.960319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.960334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.961889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.961938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.961986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.962029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.962443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.962493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.962535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.962576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.962617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.963055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.963073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.727 [2024-07-12 22:42:20.963088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.963103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.965181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.965225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.965265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.965307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.965580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.965637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.965678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.965733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.965773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.966044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.966061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.966076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.966090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.967698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.967742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.967786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.967826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.968173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.968231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.968277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.968318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.968358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.727 [2024-07-12 22:42:20.968809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.968826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.968844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.968859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.970974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.971018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.971058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.971098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.971364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.971425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.971466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.971507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.971551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.971818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.971834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.971849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.971863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.973521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.973579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.973622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.973662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.973936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.973993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.974036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.727 [2024-07-12 22:42:20.974078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.974118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.974530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.974547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.974561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.974577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.976802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.976854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.976896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.976948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.977218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.977269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.977336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.977377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.977418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.727 [2024-07-12 22:42:20.977685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.977701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.977715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.977730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.979420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.979466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.979510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.979550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.979819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.729 [2024-07-12 22:42:20.979877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.979919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.979965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.980013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.980375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.980391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.980406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.980421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.982653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.982697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.982738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.982778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.983079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.983142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.983183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.983223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.983263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.983527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.983543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.983558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.983572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.985246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.985294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.985334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.729 [2024-07-12 22:42:20.985374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.985641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.985704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.985745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.985786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.985826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.986164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.986182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.986198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.986213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.988934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.988981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.989022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.989062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.989364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.989428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.989469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.989510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.989551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.989823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.989839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.989853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.989868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.991484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.729 [2024-07-12 22:42:20.991528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.991891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.991941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.991982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.992249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.992265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.729 [2024-07-12 22:42:20.992280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.133072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.133154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.133519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.133571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.133941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.134396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.136153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.137673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.138409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.139710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.141254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.141531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.141601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.143031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.143086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.143443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.143497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.991 [2024-07-12 22:42:21.143854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.144245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.144686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.144707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.144722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.144738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.147020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.148507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.150121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.151665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.151945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.152351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.152742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.153136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.153526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.153848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.153865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.153880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.153894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.156814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.158118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.159649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.161192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.161587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.991 [2024-07-12 22:42:21.162003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.162389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.162780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.991 [2024-07-12 22:42:21.163579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.163881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.163898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.163914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.163933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.166814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.168353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.169878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.170788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.171234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.171639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.172033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.172424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.173963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.174241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.174257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.174272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.174287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.177529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.179080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.180422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.992 [2024-07-12 22:42:21.180813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.181249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.181648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.182046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.183418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.184723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.185004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.185020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.185035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.185049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.188243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.189795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.190200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.190591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.190971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.191371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.192291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.193591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.195126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.195398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.195414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.195429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.195444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.198615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.992 [2024-07-12 22:42:21.199377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.199764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.200160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.200620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.201056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.202474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.204025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.205562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.205836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.205852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.205867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.205881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.208730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.209138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.209529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.209939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.210378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.211536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.213077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.214617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.215092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.215367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.215383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.215398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.992 [2024-07-12 22:42:21.215412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.217689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.218091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.219183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.220478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.220754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.222315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.223438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.224991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.226394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.226674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.226691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.226705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.226719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.229140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.229795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.231074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.232638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.232921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.234490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.235274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.236567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.238143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.238417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.992 [2024-07-12 22:42:21.238433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.238447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.238462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.241007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.241402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.241796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.242210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.992 [2024-07-12 22:42:21.242660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.243064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.243454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.243849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.244248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.244712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.244730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.244745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.244760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.247393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.247798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.248194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.248240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.248658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.249067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.249465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:10.993 [2024-07-12 22:42:21.249855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:10.993 [2024-07-12 22:42:21.250254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:10.993 [... accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! -- message repeated continuously through 2024-07-12 22:42:21.452 ...]
00:35:11.265 [2024-07-12 22:42:21.452679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:11.265 [2024-07-12 22:42:21.452725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.452765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.453043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.453059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.454741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.456296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.456349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.456394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.456664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.456685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.456699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.457112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.457157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.457198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.457584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.457997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.458014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.461427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.461477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.461518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.462226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.462499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.462516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.462530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.265 [2024-07-12 22:42:21.462588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.462636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.464160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.464206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.464478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.464494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.466487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.466533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.467803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.467849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.468292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.468309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.468324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.468380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.468771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.468823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.468864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.469145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.469162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.472084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.472134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.473665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.473710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.473990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.265 [2024-07-12 22:42:21.474008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.474023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.475226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.475274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.265 [2024-07-12 22:42:21.475666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.475710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.476160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.476177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.480569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.480624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.481918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.481972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.482245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.482262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.482276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.483843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.483890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.484787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.484844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.485126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.485143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.487814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.487901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.489333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.266 [2024-07-12 22:42:21.489381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.489655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.489672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.489686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.491250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.491297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.492009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.492056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.492361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.492378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.496658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.496713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.498241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.498293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.498571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.498587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.498602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.500147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.500195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.501774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.501828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.502179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.502196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.504562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.266 [2024-07-12 22:42:21.504613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.505373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.505418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.505850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.505872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.505887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.507179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.507226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.507627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.507672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.508123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.508141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.514638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.514695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.516240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.516286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.516583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.516600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.516615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.517027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.517075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.517461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.517509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.517988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.266 [2024-07-12 22:42:21.518006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.521216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.521267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.522114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.522160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.522477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.522493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.522508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.524072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.524119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.525647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.525698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.526078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.526096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.533666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.533723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.535264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.535310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.535578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.535594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.535608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.266 [2024-07-12 22:42:21.537143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.537190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.538533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.267 [2024-07-12 22:42:21.538578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.538885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.538901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.541087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.541141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.541530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.541573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.541987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.542005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.542019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.543328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.543374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.544895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.544948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.545220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.545237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.551117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.551528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.551936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.553379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.553840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.553858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.553873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.554446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.267 [2024-07-12 22:42:21.554496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.555766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.557311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.557584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.557601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.560784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.561865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.562260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.562650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.563115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.563134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.563149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.563547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.565123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.566836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.568424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.568694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.568710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.573698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.574109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.575509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.575963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.576403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.576421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.267 [2024-07-12 22:42:21.576440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.267 [2024-07-12 22:42:21.578004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.579453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.581023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.582605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.583038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.583056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.584958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.585352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.585741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.586137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.586467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.586484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.586499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.587746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.589201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.590721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.591673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.591954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.591971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.596014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.597526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.597919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.598752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.587 [2024-07-12 22:42:21.599086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.599104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.599118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.600603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.602067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.603280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.604576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.604853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.604870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.607452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.609046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.610503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.612069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.612343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.612359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.612374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.613125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.614419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.615994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.617527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.617822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.617838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.624245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.625553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.587 [2024-07-12 22:42:21.627103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.628636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.629017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.629034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.629048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.630408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.631954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.633500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.634753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.635200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.635217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.638703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.640285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.641832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.642424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.642698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.642715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.642730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.644047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.645617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.647166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.647567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.647841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.647858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.587 [2024-07-12 22:42:21.651030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.651444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.651833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.652226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.652575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.652593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.652607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.653017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.587 [2024-07-12 22:42:21.654198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.654869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.655261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.655540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.655557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.658104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.658498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.658887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.659286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.659672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.659689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.659704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.660127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.661778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.662175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.662732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.588 [2024-07-12 22:42:21.663010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.663027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.666427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.666830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.667233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.668262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.668586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.668603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.668618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.669029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.670102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.670879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.671273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.671651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.671668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.674284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.674683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.675084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.675139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.675409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.675426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.675440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.675902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.588 [2024-07-12 22:42:21.676298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.588 [2024-07-12 22:42:21.677979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:11.588 [... the same accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! message repeats several hundred more times, timestamped between 2024-07-12 22:42:21.678 and 22:42:21.925 ...]
00:35:11.855 [2024-07-12 22:42:21.925923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:11.855 [2024-07-12 22:42:21.925945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.926005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.926054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.927661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.927705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.928047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.928064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.929651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.929698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.931176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.931223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.931686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.931707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.931724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.931781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.932182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.932228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.932269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.855 [2024-07-12 22:42:21.932546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.932564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.936176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.937369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.937417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.937459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.856 [2024-07-12 22:42:21.937779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.937796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.937810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.939374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.939420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.939461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.941018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.941424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.941440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.943751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.943800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.943843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.944881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.945207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.945224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.945238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.945297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.945338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.946879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.946932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.947204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.947220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.951248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.951298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.856 [2024-07-12 22:42:21.952360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.952409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.952862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.952880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.952896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.952957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.954166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.954211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.954251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.954692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.954711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.959811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.959870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.961405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.961450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.961722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.961738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.961752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.963206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.963252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.964554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.964599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.965038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.965054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.856 [2024-07-12 22:42:21.969052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.969106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.970641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.970693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.971054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.971071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.971086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.972397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.972444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.973967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.974011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.974286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.974302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.977896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.977954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.979551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.979597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.979872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.979888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.979902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.981517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.981573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.983089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.983133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.856 [2024-07-12 22:42:21.983446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.983463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.987852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.987906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.989507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.989561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.990055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.990073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.990088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.990499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.990545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.991837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.991883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.992156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.992177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.997798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.997853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.999446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.999498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.999977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.856 [2024-07-12 22:42:21.999995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.000010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.000411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.000459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.857 [2024-07-12 22:42:22.001898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.001947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.002392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.002410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.007709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.007765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.009305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.009350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.009624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.009641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.009655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.010483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.010533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.012018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.012061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.012507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.012524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.017312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.017367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.018610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.018660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.018939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.018955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.018970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.857 [2024-07-12 22:42:22.020264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.020310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.021828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.021872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.022148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.022165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.026017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.026073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.027367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.027412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.027682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.027699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.027713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.029262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.029307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.030263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.030323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.030597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.030613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.034915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.034973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.036155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.036199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.036628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.857 [2024-07-12 22:42:22.036645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.036661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.037636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.037692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.038991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.039036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.039306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.039322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.044595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.046015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.046406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.047197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.047476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.047493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.047508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.047912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.047962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.049070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.050359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.050633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.050650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.055893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.057541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.057937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.857 [2024-07-12 22:42:22.058478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.058752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.058769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.058784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.059194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.059885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.061181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.062711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.062991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.063012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.068813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.069317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.069708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.071417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.071885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.071902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.071918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.072326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.073862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.857 [2024-07-12 22:42:22.075516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.077161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.077605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.077621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.082995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.858 [2024-07-12 22:42:22.083827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.084222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.085712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.086029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.086045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.086060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.087618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.089155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.089852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.091191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.091466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.091482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.097414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.097812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.098929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.100227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.100506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.100523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.100538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.102094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.103076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.104763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.106304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.106580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.858 [2024-07-12 22:42:22.106596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.110740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.111505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.112801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.114375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.114653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.114670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.114684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.116029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.117040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.118322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.119890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.120169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.120185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.123744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.124149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.124547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.126114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.126588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.126606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.126621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.127030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.128512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.128902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.858 [2024-07-12 22:42:22.129300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.129779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.129796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.133695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.134109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.135018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.135970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.136406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.136424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.136439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.137389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.138293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.138683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.139085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.139453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.139470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.144817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.145224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.146777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.147179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.147622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.147639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.147655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.149242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.858 [2024-07-12 22:42:22.149643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.150040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.150434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.150918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.150940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.156670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.157279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.158537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.158931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.159300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.159316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.159331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.160612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.161007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.161400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.161794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.162184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.162201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.858 [2024-07-12 22:42:22.166920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.859 [2024-07-12 22:42:22.167886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.859 [2024-07-12 22:42:22.168787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.859 [2024-07-12 22:42:22.169192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.859 [2024-07-12 22:42:22.169502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:11.859 [2024-07-12 22:42:22.169518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:11.859 [2024-07-12 22:42:22.169533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:11.859 [... same error from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeated continuously; log timestamps 2024-07-12 22:42:22.169533 through 22:42:22.455909, elapsed-time stamps 00:35:11.859 through 00:35:12.392 ...]
00:35:12.392 [2024-07-12 22:42:22.455909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:12.392 [2024-07-12 22:42:22.455954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.456268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.456284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.456299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.456702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.456748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.456792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.457184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.457647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.457664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.462762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.462817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.462858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.464392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.464668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.464684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.464698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.464757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.464798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.465404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.465449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.465902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.465920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.469494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.392 [2024-07-12 22:42:22.469544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.470639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.470683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.470966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.470983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.470998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.471051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.472493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.472538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.472597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.472871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.472887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.477163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.478460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.478507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.478548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.478821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.478837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.478851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.480416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.480473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.480513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.481426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.481743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.392 [2024-07-12 22:42:22.481759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.486086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.486141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.486187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.487651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.487933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.487949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.487964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.488020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.488078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.489621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.489667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.489943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.489961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.494800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.494852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.495247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.495290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.495726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.495744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.495759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.495809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.496718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.392 [2024-07-12 22:42:22.496765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.392 [2024-07-12 22:42:22.496806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.497133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.497149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.503459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.503514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.503909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.503957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.504406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.504424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.504439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.504838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.504881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.505275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.505319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.505634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.505651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.510956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.511012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.512532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.512577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.512997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.513015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.513029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.513434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.393 [2024-07-12 22:42:22.513479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.513867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.513910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.514346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.514364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.520046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.520100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.521629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.521682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.521960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.521976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.521991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.522396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.522441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.522829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.522872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.523271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.523288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.528351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.528412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.529997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.530051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.530321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.530342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.393 [2024-07-12 22:42:22.530356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.532011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.532065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.532455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.532498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.532933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.532951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.537474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.537530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.539100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.539151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.539420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.539437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.539451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.541069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.541138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.542637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.542682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.543105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.543123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.547847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.547903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.549510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.549554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.393 [2024-07-12 22:42:22.549824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.549840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.549854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.551330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.551378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.551771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.551819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.552248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.552266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.556943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.556997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.558677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.558729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.559007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.559024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.559039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.560589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.560636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.562289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.562334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.562737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.562754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.567831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.567886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.393 [2024-07-12 22:42:22.568718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.568764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.569041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.393 [2024-07-12 22:42:22.569058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.569073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.570571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.570620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.572253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.572299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.572567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.572584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.576887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.576955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.578481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.578526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.578939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.578956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.578971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.580687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.580736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.582401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.582448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.582717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.582734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.394 [2024-07-12 22:42:22.588240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.589822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.591370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.592098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.592371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.592388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.592403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.593710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.593759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.595320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.596854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.597344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.597362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.600825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.601230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.601622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.602023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.602372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.602390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.602409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.602814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.603211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.603602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.604001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.394 [2024-07-12 22:42:22.604398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.604414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.608004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.608404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.608794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.609190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.609595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.609612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.609627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.610051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.610454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.610843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.611239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.611663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.611683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.615192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.615594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.615993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.616383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.616855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.616874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.616889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.617295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.617694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.394 [2024-07-12 22:42:22.618097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.618495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.618924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.618945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.622341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.622746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.623148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.623541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.623976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.623996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.624011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.624408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.624799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.625196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.625592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.626003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.626020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.629522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.629942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.630336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.630733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.631213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.631231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.631247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.394 [2024-07-12 22:42:22.631651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.632049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.632443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.632834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.633272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.633290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.394 [2024-07-12 22:42:22.636662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.637068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.637466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.637856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.638230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.638249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.638264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.638669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.639068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.639461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.639867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.640310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.640328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.643129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.643531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.643939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.644337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.644794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.395 [2024-07-12 22:42:22.644811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.644826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.645235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.645627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.646028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.646428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.646837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.646854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.649551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.649957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.650348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.650737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.651185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.651204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.651225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.651625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.652029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.652431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.652820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.653324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.653345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.656011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.656405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.395 [2024-07-12 22:42:22.656798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.395 [2024-07-12 22:42:22.657222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:12.395 ... [same *ERROR* entry from accel_dpdk_cryptodev.c:468 repeated continuously from 22:42:22.657222 through 22:42:22.890453] ...
00:35:12.661 [2024-07-12 22:42:22.890453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:12.661 [2024-07-12 22:42:22.892065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.892459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.892504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.892547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.892994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.893013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.893033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.893097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.893488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.893532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.893575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.893972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.893989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.895524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.896622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.896677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.896718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.897026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.897044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.897058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.898616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.898663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.898704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.900246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.661 [2024-07-12 22:42:22.900615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.900632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.904476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.904528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.904568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.906112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.906385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.906402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.906416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.906474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.906515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.907555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.907601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.907876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.907893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.909548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.909594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.909989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.910033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.910417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.910434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.910449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.910498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.910887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.661 [2024-07-12 22:42:22.910939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.910983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.911283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.911300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.912935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.914515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.914569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.914612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.914882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.914898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.914913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.916487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.916542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.916586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.917967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.918342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.918359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.922168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.922219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.922265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.923808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.924175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.924193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.924208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.661 [2024-07-12 22:42:22.924266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.924320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.925924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.925993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.926264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.926280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.661 [2024-07-12 22:42:22.928399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.928445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.928834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.928878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.929205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.929222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.929236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.929291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.930582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.930628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.930669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.930942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.930960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.934130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.934182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.935847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.935902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.936322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.662 [2024-07-12 22:42:22.936342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.936357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.936763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.936809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.937207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.937253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.937687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.937704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.940243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.940311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.941823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.941875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.942151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.942168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.942183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.943855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.943910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.945282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.945328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.945738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.945755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.949298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.949349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.950871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.662 [2024-07-12 22:42:22.950916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.951194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.951211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.951225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.951974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.952023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.953316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.953362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.953635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.953651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.955901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.955957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.956347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.956390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.956659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.956675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.956690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.957997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.958044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.959608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.959653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.959922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.959946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.963191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.662 [2024-07-12 22:42:22.963244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.963644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.963691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.964155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.964175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.964190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.964589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.964634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.965032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.965076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.965417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.965434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.968054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.968105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.968502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.968561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.968916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.968941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.968956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.969369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.969417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.969805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.969846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.970299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.662 [2024-07-12 22:42:22.970317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.972975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.973025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.973412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.973456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.973809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.973827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.973843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.974259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.974310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.974698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.974742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.975188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.975209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.977902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.977963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.978353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.978395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.978854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.978872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.978888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.979303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.979361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.979760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.662 [2024-07-12 22:42:22.979814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.980276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.662 [2024-07-12 22:42:22.980293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.982920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.982981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.983369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.983411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.983850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.983868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.983883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.984291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.984336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.984722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.984791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.985212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.985230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.987893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.988302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.988699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.989100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.989517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.989535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.989550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.989954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.924 [2024-07-12 22:42:22.990003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.990399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.990795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.991196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.991218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.993896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.994304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.994696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.995091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.995524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.995542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.995560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.995970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.996367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.996759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.997156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.997588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:22.997605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:23.000225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:23.000627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:23.001026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:23.001423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:23.001785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:23.001803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.924 [2024-07-12 22:42:23.001818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:23.002246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:23.002637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:23.003034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.924 [2024-07-12 22:42:23.003425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.003843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.003861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.006571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.006994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.007406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.007797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.008226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.008244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.008259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.008657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.009053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.009447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.009839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.010279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.010297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.012878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.013279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.013668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.014061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.925 [2024-07-12 22:42:23.014439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.014457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.014472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.014877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.015294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.015683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.016077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.016506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.016523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.019334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.019729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.020142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.020539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.020985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.021004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.021020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.021421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.021815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.022216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.022611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.023027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.023045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.025696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.026099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.925 [2024-07-12 22:42:23.026489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.026880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.027327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.027345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.027361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.027761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.028168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.028561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.028959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.029411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.029429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.032037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.032447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.032839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.033259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.033632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.033650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.033665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.034076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.034467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.034856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.035330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.035602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:12.925 [2024-07-12 22:42:23.035619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:12.925 [2024-07-12 22:42:23.038044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical *ERROR* line from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources: "Failed to get src_mbufs!") repeated several hundred times between 22:42:23.038 and 22:42:23.282, log timestamps 00:35:12.925-00:35:13.192 ...]
00:35:13.192 [2024-07-12 22:42:23.282030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:13.192 [2024-07-12 22:42:23.282086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.282531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.282549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.284771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.286071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.286117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.286158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.286433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.286449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.286463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.286522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.287255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.287303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.287344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.287695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.287720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.289670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.290073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.290119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.290161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.290609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.290627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.290644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.290695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:13.192 [2024-07-12 22:42:23.292406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.292456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.292503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.292778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.292794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.294503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.296055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.296101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.296142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.296412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.296428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.296443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.297610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.297657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.297713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.298114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.192 [2024-07-12 22:42:23.298561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.298583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.302152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.302204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.302245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.303831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.304316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.304333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:13.193 [2024-07-12 22:42:23.304348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.304408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.304449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.305733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.305778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.306060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.306076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.307966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.308012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.308403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.308447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.308881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.308899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.308914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.308985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.310489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.310539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.310580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.310854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.310871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.312544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.314105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.314152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.314193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:13.193 [2024-07-12 22:42:23.314462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.314479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.314494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.315435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.315502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.315556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.315948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.316372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.316389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.320005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.320059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.320110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.321558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.321894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.321910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.321930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.321985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.322027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.323311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.323355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.323629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.323646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.325817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.325878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:13.193 [2024-07-12 22:42:23.326278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.326322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.326734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.326751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.326766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.326820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.328117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.328163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.328204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.328475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.328492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.331707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.331759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.333300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.333345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.333701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.333718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.333734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.334147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.334198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.334586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.334631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.335038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.335056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:13.193 [2024-07-12 22:42:23.337775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.337825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.338218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.338262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.338673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.338690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.338705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.339122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.339173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.339564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.339610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.340059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.340079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.342654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.342704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.343101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.343145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.343599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.343619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.193 [2024-07-12 22:42:23.343634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.344037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.344082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.344478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.344532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:13.194 [2024-07-12 22:42:23.344904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.344921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.347837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.347891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.348295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.348342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.348791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.348810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.348826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.349247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.349293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.349682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.349726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.350076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.350094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.352700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.352750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.353148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.353198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.353538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.353555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.353570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.353979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.354027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:13.194 [2024-07-12 22:42:23.354414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.354459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.354942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.354960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.357576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.357633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.358034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.358081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.358537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.358553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.358568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.358979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.359031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.359437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.359485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.359935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.359954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.362587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.362640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.363034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.363080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.363479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.363496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.363511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:13.194 [2024-07-12 22:42:23.363914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.363968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.364361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.364409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.364760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.364777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.367897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.367976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.368370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.368423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.368869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.368886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.368902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.369311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.369361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.369749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.369791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.370205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.370223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.372884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.372942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.373334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.373381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.373772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:13.194 [2024-07-12 22:42:23.373789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.373804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.374216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.374266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.374649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.374692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.375099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.375117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.377827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.378228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.378621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.194 [2024-07-12 22:42:23.379021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.195 [2024-07-12 22:42:23.379383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.195 [2024-07-12 22:42:23.379404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.195 [2024-07-12 22:42:23.379419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.195 [2024-07-12 22:42:23.379829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.195 [2024-07-12 22:42:23.379881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.195 [2024-07-12 22:42:23.380282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.195 [2024-07-12 22:42:23.380674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.195 [2024-07-12 22:42:23.381107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.195 [2024-07-12 22:42:23.381125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.195 [2024-07-12 22:42:23.384715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.195 [2024-07-12 22:42:23.385130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:13.195 [2024-07-12 22:42:23.385524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
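Editor's note: the burst of identical allocation errors condensed above (its last occurrence appears just below, before the verify results) is the DPDK cryptodev accel module repeatedly failing to pull source mbufs from its pool while 128-deep, 64 KiB verify I/O is outstanding on every crypto_ram bdev. The stand-alone C sketch below illustrates that failure mode using the public DPDK mbuf API; the pool handle, burst size, helper name, and retry policy are illustrative assumptions, not SPDK's actual accel_dpdk_cryptodev.c code.

#include <errno.h>
#include <rte_mbuf.h>
#include <rte_mempool.h>

#define SRC_MBUF_BURST 32   /* illustrative burst size, not SPDK's value */

/* Try to take a burst of source mbufs from a shared pool.  Under heavy load
 * the pool can be momentarily empty; rte_pktmbuf_alloc_bulk() then fails and
 * the caller should requeue the task and retry later rather than treat the
 * condition as fatal -- that transient exhaustion is what the repeated
 * "Failed to get src_mbufs!" entries above record. */
static int
alloc_src_mbufs(struct rte_mempool *mbuf_pool, struct rte_mbuf **mbufs)
{
	if (rte_pktmbuf_alloc_bulk(mbuf_pool, mbufs, SRC_MBUF_BURST) != 0) {
		return -ENOMEM;   /* pool exhausted: back off and retry */
	}

	return 0;
}

The test still ends with END TEST bdev_verify_big_io and return 0, which is consistent with the failures being transient back-pressure rather than a hard error; the very high crypto_ram/crypto_ram1 averages in the latency table that follows are consistent with that back-pressure.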
00:35:13.195 [2024-07-12 22:42:23.385917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:13.763
00:35:13.763 Latency(us)
00:35:13.763 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:35:13.763 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:13.763 Verification LBA range: start 0x0 length 0x100
00:35:13.763 crypto_ram : 6.11 41.93 2.62 0.00 0.00 2966846.78 286306.84 2640587.91
00:35:13.763 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:13.763 Verification LBA range: start 0x100 length 0x100
00:35:13.763 crypto_ram : 6.05 42.31 2.64 0.00 0.00 2935382.37 315484.61 2523876.84
00:35:13.763 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:13.763 Verification LBA range: start 0x0 length 0x100
00:35:13.763 crypto_ram1 : 6.11 41.92 2.62 0.00 0.00 2866056.68 284483.23 2450932.42
00:35:13.763 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:13.763 Verification LBA range: start 0x100 length 0x100
00:35:13.763 crypto_ram1 : 6.05 42.31 2.64 0.00 0.00 2836351.78 315484.61 2319632.47
00:35:13.763 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:13.763 Verification LBA range: start 0x0 length 0x100
00:35:13.763 crypto_ram2 : 5.60 258.22 16.14 0.00 0.00 442545.69 23592.96 652852.54
00:35:13.763 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:13.763 Verification LBA range: start 0x100 length 0x100
00:35:13.763 crypto_ram2 : 5.61 273.43 17.09 0.00 0.00 420191.77 86165.59 641910.87
00:35:13.763 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:13.763 Verification LBA range: start 0x0 length 0x100
00:35:13.763 crypto_ram3 : 5.72 269.04 16.81 0.00 0.00 413500.03 29177.77 344662.37
00:35:13.763 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:13.763 Verification LBA range: start 0x100 length 0x100
00:35:13.763 crypto_ram3 : 5.74 285.05 17.82 0.00 0.00 390790.05 53796.51 474138.71
===================================================================================================================
00:35:13.763 Total : 1254.20 78.39 0.00 0.00 770487.00 23592.96 2640587.91
00:35:14.331
00:35:14.331 real 0m9.323s
00:35:14.331 user 0m17.725s
00:35:14.331 sys 0m0.440s
00:35:14.331 22:42:24 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:14.331 22:42:24 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:35:14.331 ************************************
00:35:14.331 END TEST bdev_verify_big_io
00:35:14.331 ************************************
00:35:14.331 22:42:24 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:35:14.331 22:42:24 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:14.331 22:42:24 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:14.331 22:42:24 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:14.331 22:42:24 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:14.331 ************************************
00:35:14.331 START TEST bdev_write_zeroes
00:35:14.331 ************************************
00:35:14.331 22:42:24 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:14.331 [2024-07-12 22:42:24.650434] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization...
00:35:14.331 [2024-07-12 22:42:24.650493] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3628267 ]
00:35:14.590 [2024-07-12 22:42:24.778279] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:35:14.590 [2024-07-12 22:42:24.878441] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:35:14.590 [2024-07-12 22:42:24.899745] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:35:14.590 [2024-07-12 22:42:24.907773] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:35:14.849 [2024-07-12 22:42:24.915792] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:35:14.849 [2024-07-12 22:42:25.031982] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:35:17.382 [2024-07-12 22:42:27.248163] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:35:17.382 [2024-07-12 22:42:27.248231] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:35:17.382 [2024-07-12 22:42:27.248246] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:17.382 [2024-07-12 22:42:27.256182] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:35:17.382 [2024-07-12 22:42:27.256201] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:35:17.382 [2024-07-12 22:42:27.256213] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:17.382 [2024-07-12 22:42:27.264220] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:35:17.382 [2024-07-12 22:42:27.264238] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:35:17.382 [2024-07-12 22:42:27.264249] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:17.382 [2024-07-12 22:42:27.272222] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:35:17.382 [2024-07-12 22:42:27.272239] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:35:17.382 [2024-07-12 22:42:27.272251] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:17.382 Running I/O for 1 seconds...
00:35:18.319
00:35:18.319 Latency(us)
00:35:18.319 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:35:18.319 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:18.319 crypto_ram : 1.03 2012.04 7.86 0.00 0.00 63140.45 5584.81 76591.64
00:35:18.319 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:18.319 crypto_ram1 : 1.03 2017.56 7.88 0.00 0.00 62609.53 5584.81 71120.81
00:35:18.319 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:18.319 crypto_ram2 : 1.02 15491.93 60.52 0.00 0.00 8140.96 2450.48 10713.71
00:35:18.319 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:18.319 crypto_ram3 : 1.02 15523.95 60.64 0.00 0.00 8097.99 2436.23 8434.20
===================================================================================================================
00:35:18.319 Total : 35045.48 136.90 0.00 0.00 14442.50 2436.23 76591.64
00:35:18.578
00:35:18.578 real 0m4.169s
00:35:18.578 user 0m3.747s
00:35:18.578 sys 0m0.377s
00:35:18.578 22:42:28 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:18.578 22:42:28 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:35:18.578 ************************************
00:35:18.578 END TEST bdev_write_zeroes
00:35:18.578 ************************************
00:35:18.578 22:42:28 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:35:18.578 22:42:28 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:18.578 22:42:28 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:18.578 22:42:28 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:18.578 22:42:28 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:18.578 ************************************
00:35:18.578 START TEST bdev_json_nonenclosed
00:35:18.578 ************************************
00:35:18.578 22:42:28 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:18.837 [2024-07-12 22:42:28.922873] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization...
00:35:18.837 [2024-07-12 22:42:28.922947] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3628807 ]
00:35:18.837 [2024-07-12 22:42:29.056067] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:35:19.096 [2024-07-12 22:42:29.163453] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:35:19.096 [2024-07-12 22:42:29.163526] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:35:19.096 [2024-07-12 22:42:29.163547] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:35:19.096 [2024-07-12 22:42:29.163561] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:35:19.096
00:35:19.096 real 0m0.415s
00:35:19.096 user 0m0.247s
00:35:19.096 sys 0m0.164s
00:35:19.096 22:42:29 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:35:19.096 22:42:29 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:19.096 22:42:29 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:35:19.096 ************************************
00:35:19.096 END TEST bdev_json_nonenclosed
00:35:19.096 ************************************
00:35:19.096 22:42:29 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:35:19.096 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true
00:35:19.096 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:19.096 22:42:29 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:19.096 22:42:29 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:19.096 22:42:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:19.096 ************************************
00:35:19.096 START TEST bdev_json_nonarray
00:35:19.096 ************************************
00:35:19.096 22:42:29 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:19.355 [2024-07-12 22:42:29.427312] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization...
00:35:19.356 [2024-07-12 22:42:29.427377] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3628835 ]
00:35:19.356 [2024-07-12 22:42:29.559705] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:35:19.356 [2024-07-12 22:42:29.663351] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:35:19.356 [2024-07-12 22:42:29.663436] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
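Editor's note: the two json_config.c *ERROR* lines above (608: "not enclosed in {}" and 614: "'subsystems' should be an array") are these negative tests working as intended -- bdevperf is expected to reject nonenclosed.json and nonarray.json and exit non-zero (es=234). The stand-alone sketch below only illustrates the top-level shape being enforced; it scans the raw config text rather than parsing it, the helper name is invented, and it is not SPDK's actual json_config.c logic.

#include <ctype.h>
#include <stdbool.h>
#include <string.h>

/* Expected top-level shape of a bdevperf --json config:
 *   { "subsystems": [ { "subsystem": "bdev", "config": [ ... ] } ] }
 * Returns false for the two cases the tests above provoke:
 *   - text not enclosed in {}              (nonenclosed.json)
 *   - "subsystems" present but not [ ... ] (nonarray.json)
 * Assumption: "subsystems" appears only as the top-level member. */
static bool config_shape_is_valid(const char *text)
{
	const char *p = text;

	while (*p && isspace((unsigned char)*p))
		p++;
	if (*p != '{')
		return false;                      /* "not enclosed in {}" */

	p = strstr(p, "\"subsystems\"");
	if (p == NULL)
		return false;
	p = strchr(p + strlen("\"subsystems\""), ':');
	if (p == NULL)
		return false;
	p++;
	while (*p && isspace((unsigned char)*p))
		p++;

	return *p == '[';                          /* must be an array */
}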
00:35:19.356 [2024-07-12 22:42:29.663459] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:35:19.356 [2024-07-12 22:42:29.663472] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:35:19.614
00:35:19.614 real 0m0.403s
00:35:19.614 user 0m0.240s
00:35:19.614 sys 0m0.160s
00:35:19.614 22:42:29 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:35:19.614 22:42:29 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:19.614 22:42:29 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:35:19.614 ************************************
00:35:19.614 END TEST bdev_json_nonarray
00:35:19.614 ************************************
00:35:19.614 22:42:29 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:35:19.614 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true
00:35:19.614 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]]
00:35:19.614 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]]
00:35:19.614 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]]
00:35:19.614 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT
00:35:19.614 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup
00:35:19.614 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:35:19.614 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:35:19.614 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]]
00:35:19.614 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]]
00:35:19.614 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]]
00:35:19.614 22:42:29 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]]
00:35:19.614
00:35:19.614 real 1m12.179s
00:35:19.614 user 2m40.206s
00:35:19.614 sys 0m9.078s
00:35:19.614 22:42:29 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:19.614 22:42:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:19.614 ************************************
00:35:19.614 END TEST blockdev_crypto_qat
00:35:19.614 ************************************
00:35:19.614 22:42:29 -- common/autotest_common.sh@1142 -- # return 0
00:35:19.614 22:42:29 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:35:19.614 22:42:29 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:35:19.614 22:42:29 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:19.614 22:42:29 -- common/autotest_common.sh@10 -- # set +x
00:35:19.614 ************************************
00:35:19.615 START TEST chaining
00:35:19.615 ************************************
00:35:19.615 22:42:29 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:35:19.874 * Looking for test storage...
00:35:19.874 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:19.874 22:42:30 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@7 -- # uname -s 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:35:19.874 22:42:30 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:35:19.874 22:42:30 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:35:19.874 22:42:30 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:35:19.874 22:42:30 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:19.874 22:42:30 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:19.874 22:42:30 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:19.874 22:42:30 chaining -- paths/export.sh@5 -- # 
export PATH 00:35:19.874 22:42:30 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@47 -- # : 0 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:35:19.874 22:42:30 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:35:19.874 22:42:30 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:35:19.874 22:42:30 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:35:19.874 22:42:30 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:35:19.874 22:42:30 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:35:19.874 22:42:30 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:19.874 22:42:30 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:19.874 22:42:30 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:19.874 22:42:30 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:19.874 22:42:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:26.440 
22:42:36 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@336 -- # return 1 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:35:26.440 WARNING: No supported devices were found, fallback requested for tcp test 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:35:26.440 22:42:36 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:35:26.441 22:42:36 
chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:35:26.441 22:42:36 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:35:26.441 22:42:36 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:35:26.441 22:42:36 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:35:26.441 Cannot find device "nvmf_tgt_br" 00:35:26.441 22:42:36 chaining -- nvmf/common.sh@155 -- # true 00:35:26.441 22:42:36 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:35:26.699 Cannot find device "nvmf_tgt_br2" 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@156 -- # true 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:35:26.699 Cannot find device "nvmf_tgt_br" 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@158 -- # true 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:35:26.699 Cannot find device "nvmf_tgt_br2" 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@159 -- # true 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:35:26.699 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@162 -- # true 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:35:26.699 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@163 -- # true 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:35:26.699 22:42:36 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:35:26.699 22:42:37 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:35:26.699 22:42:37 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:35:26.699 22:42:37 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:35:26.957 22:42:37 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:35:26.957 22:42:37 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:35:26.957 22:42:37 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:35:26.957 22:42:37 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:35:26.957 22:42:37 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:35:26.957 
22:42:37 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:35:26.957 22:42:37 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:35:26.957 22:42:37 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:35:26.957 22:42:37 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:35:27.216 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:27.216 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.103 ms 00:35:27.216 00:35:27.216 --- 10.0.0.2 ping statistics --- 00:35:27.216 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:27.216 rtt min/avg/max/mdev = 0.103/0.103/0.103/0.000 ms 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:35:27.216 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:35:27.216 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.081 ms 00:35:27.216 00:35:27.216 --- 10.0.0.3 ping statistics --- 00:35:27.216 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:27.216 rtt min/avg/max/mdev = 0.081/0.081/0.081/0.000 ms 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:35:27.216 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:35:27.216 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.039 ms 00:35:27.216 00:35:27.216 --- 10.0.0.1 ping statistics --- 00:35:27.216 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:27.216 rtt min/avg/max/mdev = 0.039/0.039/0.039/0.000 ms 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@433 -- # return 0 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:27.216 22:42:37 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:27.216 22:42:37 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:27.216 22:42:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@481 -- # nvmfpid=3632519 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:35:27.216 22:42:37 chaining -- nvmf/common.sh@482 -- # waitforlisten 3632519 00:35:27.216 22:42:37 chaining -- common/autotest_common.sh@829 -- # '[' -z 3632519 ']' 00:35:27.216 22:42:37 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:27.216 22:42:37 chaining -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:35:27.216 22:42:37 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:27.216 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:27.216 22:42:37 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:27.216 22:42:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:27.474 [2024-07-12 22:42:37.565318] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:35:27.474 [2024-07-12 22:42:37.565385] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:27.474 [2024-07-12 22:42:37.690660] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:27.474 [2024-07-12 22:42:37.791643] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:27.474 [2024-07-12 22:42:37.791688] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:27.474 [2024-07-12 22:42:37.791703] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:27.474 [2024-07-12 22:42:37.791716] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:35:27.474 [2024-07-12 22:42:37.791727] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:35:27.474 [2024-07-12 22:42:37.791763] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:28.407 22:42:38 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.407 22:42:38 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@69 -- # mktemp 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.JZr4t0zxW1 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@69 -- # mktemp 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.pnjBf4SKww 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.407 malloc0 00:35:28.407 true 00:35:28.407 true 00:35:28.407 [2024-07-12 22:42:38.530960] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:28.407 crypto0 00:35:28.407 [2024-07-12 22:42:38.538988] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:28.407 crypto1 00:35:28.407 [2024-07-12 22:42:38.547106] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:28.407 [2024-07-12 22:42:38.563342] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:28.407 22:42:38 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@85 -- # update_stats 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:28.407 22:42:38 chaining -- 
bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:28.407 22:42:38 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.407 22:42:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.665 22:42:38 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.665 22:42:38 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:28.665 22:42:38 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.JZr4t0zxW1 bs=1K count=64 00:35:28.665 64+0 records in 00:35:28.665 64+0 records out 00:35:28.665 65536 bytes (66 kB, 64 KiB) copied, 0.00106743 s, 61.4 MB/s 00:35:28.665 22:42:38 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.JZr4t0zxW1 --ob Nvme0n1 --bs 65536 --count 1 00:35:28.666 22:42:38 chaining -- bdev/chaining.sh@25 -- # local config 00:35:28.666 22:42:38 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:28.666 22:42:38 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:28.666 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:28.666 22:42:38 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:28.666 "subsystems": [ 00:35:28.666 { 00:35:28.666 "subsystem": "bdev", 00:35:28.666 "config": [ 00:35:28.666 { 00:35:28.666 "method": "bdev_nvme_attach_controller", 00:35:28.666 "params": { 00:35:28.666 "trtype": "tcp", 00:35:28.666 "adrfam": "IPv4", 00:35:28.666 "name": "Nvme0", 00:35:28.666 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:28.666 "traddr": "10.0.0.2", 00:35:28.666 "trsvcid": "4420" 00:35:28.666 } 00:35:28.666 }, 00:35:28.666 { 00:35:28.666 "method": "bdev_set_options", 00:35:28.666 "params": { 00:35:28.666 "bdev_auto_examine": false 00:35:28.666 } 00:35:28.666 } 00:35:28.666 ] 00:35:28.666 } 00:35:28.666 ] 00:35:28.666 }' 00:35:28.666 22:42:38 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:28.666 "subsystems": [ 00:35:28.666 { 00:35:28.666 "subsystem": "bdev", 00:35:28.666 "config": [ 00:35:28.666 { 00:35:28.666 "method": "bdev_nvme_attach_controller", 00:35:28.666 "params": { 00:35:28.666 "trtype": "tcp", 00:35:28.666 "adrfam": "IPv4", 00:35:28.666 "name": "Nvme0", 00:35:28.666 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:28.666 "traddr": "10.0.0.2", 00:35:28.666 "trsvcid": "4420" 00:35:28.666 } 00:35:28.666 }, 00:35:28.666 { 00:35:28.666 "method": "bdev_set_options", 00:35:28.666 "params": { 00:35:28.666 "bdev_auto_examine": false 00:35:28.666 } 00:35:28.666 } 00:35:28.666 ] 00:35:28.666 } 00:35:28.666 ] 00:35:28.666 }' 00:35:28.666 22:42:38 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.JZr4t0zxW1 --ob Nvme0n1 --bs 65536 --count 1 00:35:28.666 [2024-07-12 22:42:38.863332] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
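The spdk_dd write starting up here is driven entirely by the generated config shown above: gen_nvme.sh emits a bdev_nvme_attach_controller entry for the NVMe-oF/TCP target at 10.0.0.2:4420, jq appends a bdev_set_options entry that turns off bdev_auto_examine, and the resulting JSON is handed to spdk_dd on an anonymous descriptor (/dev/fd/62) instead of a file on disk. A condensed sketch of that pattern follows; the inlined helper and the shortened repository paths are assumptions for readability, not verbatim from chaining.sh.

  # build the bdev config for the remote namespace and disable auto-examine
  config=$(scripts/gen_nvme.sh --mode=remote --json-with-subsystems \
             --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 |
           jq '.subsystems[0].config[.subsystems[0].config | length] |=
               {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}')
  # feed the config through process substitution so spdk_dd reads it from /dev/fd/N,
  # then push the 64 KiB of random data ($input is the mktemp file filled above)
  # at Nvme0n1; the crypto0/crypto1 chain sits behind that namespace on the target
  build/bin/spdk_dd -c <(echo "$config") --if "$input" --ob Nvme0n1 --bs 65536 --count 1

Writing through Nvme0n1 is what advances the encrypt and copy counters that the accel stats checks below compare against the snapshot taken just before the transfer.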
00:35:28.666 [2024-07-12 22:42:38.863400] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3632737 ] 00:35:28.925 [2024-07-12 22:42:38.992072] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:28.925 [2024-07-12 22:42:39.091780] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:29.442  Copying: 64/64 [kB] (average 15 MBps) 00:35:29.442 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 
00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@96 -- # update_stats 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:29.442 22:42:39 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.442 22:42:39 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 
00:35:29.701 22:42:39 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.701 22:42:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:29.701 22:42:39 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:29.701 22:42:39 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:29.701 22:42:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.701 22:42:39 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.pnjBf4SKww --ib Nvme0n1 --bs 65536 --count 1 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@25 -- # local config 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:29.701 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:29.701 "subsystems": [ 00:35:29.701 { 00:35:29.701 "subsystem": "bdev", 00:35:29.701 "config": [ 00:35:29.701 { 00:35:29.701 "method": "bdev_nvme_attach_controller", 00:35:29.701 "params": { 00:35:29.701 "trtype": "tcp", 00:35:29.701 "adrfam": "IPv4", 00:35:29.701 "name": "Nvme0", 00:35:29.701 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:29.701 "traddr": "10.0.0.2", 00:35:29.701 "trsvcid": "4420" 00:35:29.701 } 00:35:29.701 }, 00:35:29.701 { 00:35:29.701 "method": "bdev_set_options", 00:35:29.701 "params": { 00:35:29.701 "bdev_auto_examine": false 00:35:29.701 } 00:35:29.701 } 00:35:29.701 ] 00:35:29.701 } 00:35:29.701 ] 00:35:29.701 }' 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.pnjBf4SKww --ib Nvme0n1 --bs 65536 --count 1 00:35:29.701 22:42:39 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:29.701 "subsystems": [ 00:35:29.701 { 00:35:29.701 "subsystem": "bdev", 00:35:29.701 "config": [ 00:35:29.701 { 00:35:29.701 "method": "bdev_nvme_attach_controller", 00:35:29.701 "params": { 00:35:29.701 "trtype": "tcp", 00:35:29.701 "adrfam": "IPv4", 00:35:29.701 "name": "Nvme0", 00:35:29.701 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:29.701 "traddr": "10.0.0.2", 00:35:29.701 "trsvcid": "4420" 00:35:29.701 } 00:35:29.701 }, 00:35:29.701 { 00:35:29.701 "method": "bdev_set_options", 00:35:29.701 "params": { 
00:35:29.701 "bdev_auto_examine": false 00:35:29.701 } 00:35:29.701 } 00:35:29.701 ] 00:35:29.701 } 00:35:29.701 ] 00:35:29.701 }' 00:35:29.701 [2024-07-12 22:42:39.993666] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:35:29.701 [2024-07-12 22:42:39.993714] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3632947 ] 00:35:29.959 [2024-07-12 22:42:40.110620] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:29.959 [2024-07-12 22:42:40.207494] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:30.477  Copying: 64/64 [kB] (average 31 MBps) 00:35:30.477 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:30.477 22:42:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:30.477 22:42:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:30.477 22:42:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:30.477 22:42:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:30.477 22:42:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:30.477 22:42:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:30.477 22:42:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 
00:35:30.477 22:42:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:30.477 22:42:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:30.477 22:42:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:30.477 22:42:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:30.477 22:42:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.JZr4t0zxW1 /tmp/tmp.pnjBf4SKww 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@25 -- # local config 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:30.477 22:42:40 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:30.477 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:30.736 22:42:40 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:30.736 "subsystems": [ 00:35:30.736 { 00:35:30.736 "subsystem": "bdev", 00:35:30.736 "config": [ 00:35:30.736 { 00:35:30.737 "method": "bdev_nvme_attach_controller", 00:35:30.737 "params": { 00:35:30.737 "trtype": "tcp", 00:35:30.737 "adrfam": "IPv4", 00:35:30.737 "name": "Nvme0", 00:35:30.737 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:30.737 "traddr": "10.0.0.2", 00:35:30.737 "trsvcid": "4420" 00:35:30.737 } 00:35:30.737 }, 00:35:30.737 { 00:35:30.737 "method": "bdev_set_options", 00:35:30.737 "params": { 00:35:30.737 "bdev_auto_examine": false 00:35:30.737 } 00:35:30.737 } 00:35:30.737 ] 00:35:30.737 } 00:35:30.737 ] 00:35:30.737 }' 00:35:30.737 22:42:40 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:35:30.737 22:42:40 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:30.737 "subsystems": [ 00:35:30.737 { 00:35:30.737 "subsystem": "bdev", 00:35:30.737 "config": [ 00:35:30.737 { 00:35:30.737 "method": "bdev_nvme_attach_controller", 00:35:30.737 "params": { 00:35:30.737 "trtype": "tcp", 00:35:30.737 "adrfam": "IPv4", 00:35:30.737 "name": "Nvme0", 00:35:30.737 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:30.737 "traddr": "10.0.0.2", 00:35:30.737 "trsvcid": "4420" 00:35:30.737 } 00:35:30.737 }, 00:35:30.737 { 00:35:30.737 "method": "bdev_set_options", 00:35:30.737 "params": { 00:35:30.737 "bdev_auto_examine": false 00:35:30.737 } 00:35:30.737 } 00:35:30.737 ] 00:35:30.737 
} 00:35:30.737 ] 00:35:30.737 }' 00:35:30.737 [2024-07-12 22:42:40.899406] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:35:30.737 [2024-07-12 22:42:40.899481] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3633027 ] 00:35:30.737 [2024-07-12 22:42:41.029322] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:30.995 [2024-07-12 22:42:41.129696] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:31.254  Copying: 64/64 [kB] (average 20 MBps) 00:35:31.254 00:35:31.254 22:42:41 chaining -- bdev/chaining.sh@106 -- # update_stats 00:35:31.254 22:42:41 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:31.254 22:42:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:31.254 22:42:41 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:31.254 22:42:41 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:31.254 22:42:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:31.254 22:42:41 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:31.254 22:42:41 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:31.254 22:42:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:31.254 22:42:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:31.254 22:42:41 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:31.254 22:42:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:31.513 22:42:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:31.513 22:42:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:31.513 22:42:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:31.513 22:42:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:31.513 22:42:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == 
"decrypt").executed' 00:35:31.513 22:42:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:31.513 22:42:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:31.513 22:42:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:31.513 22:42:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.JZr4t0zxW1 --ob Nvme0n1 --bs 4096 --count 16 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@25 -- # local config 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:31.513 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:31.513 "subsystems": [ 00:35:31.513 { 00:35:31.513 "subsystem": "bdev", 00:35:31.513 "config": [ 00:35:31.513 { 00:35:31.513 "method": "bdev_nvme_attach_controller", 00:35:31.513 "params": { 00:35:31.513 "trtype": "tcp", 00:35:31.513 "adrfam": "IPv4", 00:35:31.513 "name": "Nvme0", 00:35:31.513 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:31.513 "traddr": "10.0.0.2", 00:35:31.513 "trsvcid": "4420" 00:35:31.513 } 00:35:31.513 }, 00:35:31.513 { 00:35:31.513 "method": "bdev_set_options", 00:35:31.513 "params": { 00:35:31.513 "bdev_auto_examine": false 00:35:31.513 } 00:35:31.513 } 00:35:31.513 ] 00:35:31.513 } 00:35:31.513 ] 00:35:31.513 }' 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.JZr4t0zxW1 --ob Nvme0n1 --bs 4096 --count 16 00:35:31.513 22:42:41 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:31.513 "subsystems": [ 00:35:31.513 { 00:35:31.513 "subsystem": "bdev", 00:35:31.513 "config": [ 00:35:31.513 { 00:35:31.513 "method": "bdev_nvme_attach_controller", 00:35:31.513 "params": { 00:35:31.513 "trtype": "tcp", 00:35:31.513 "adrfam": "IPv4", 00:35:31.513 "name": "Nvme0", 00:35:31.513 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:31.513 "traddr": "10.0.0.2", 00:35:31.513 "trsvcid": "4420" 00:35:31.513 } 00:35:31.513 }, 00:35:31.513 { 00:35:31.513 "method": "bdev_set_options", 00:35:31.513 "params": { 00:35:31.513 "bdev_auto_examine": false 00:35:31.513 } 00:35:31.513 } 00:35:31.513 ] 00:35:31.513 } 00:35:31.513 ] 00:35:31.513 }' 00:35:31.513 [2024-07-12 22:42:41.833469] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
00:35:31.513 [2024-07-12 22:42:41.833525] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3633177 ] 00:35:31.774 [2024-07-12 22:42:41.946298] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:31.774 [2024-07-12 22:42:42.053246] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:32.293  Copying: 64/64 [kB] (average 12 MBps) 00:35:32.293 00:35:32.293 22:42:42 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:32.294 22:42:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:32.294 22:42:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:32.294 22:42:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:32.294 22:42:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:32.294 22:42:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:32.294 22:42:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:32.294 22:42:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:32.294 22:42:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:32.294 22:42:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:32.552 22:42:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] 
)) 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@114 -- # update_stats 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@44 -- # jq -r 
'.operations[] | select(.opcode == "decrypt").executed' 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:32.553 22:42:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:32.553 22:42:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:32.811 22:42:42 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:32.811 22:42:42 chaining -- bdev/chaining.sh@117 -- # : 00:35:32.812 22:42:42 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.pnjBf4SKww --ib Nvme0n1 --bs 4096 --count 16 00:35:32.812 22:42:42 chaining -- bdev/chaining.sh@25 -- # local config 00:35:32.812 22:42:42 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:32.812 22:42:42 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:32.812 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:32.812 22:42:42 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:32.812 "subsystems": [ 00:35:32.812 { 00:35:32.812 "subsystem": "bdev", 00:35:32.812 "config": [ 00:35:32.812 { 00:35:32.812 "method": "bdev_nvme_attach_controller", 00:35:32.812 "params": { 00:35:32.812 "trtype": "tcp", 00:35:32.812 "adrfam": "IPv4", 00:35:32.812 "name": "Nvme0", 00:35:32.812 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:32.812 "traddr": "10.0.0.2", 00:35:32.812 "trsvcid": "4420" 00:35:32.812 } 00:35:32.812 }, 00:35:32.812 { 00:35:32.812 "method": "bdev_set_options", 00:35:32.812 "params": { 00:35:32.812 "bdev_auto_examine": false 00:35:32.812 } 00:35:32.812 } 00:35:32.812 ] 00:35:32.812 } 00:35:32.812 ] 00:35:32.812 }' 00:35:32.812 22:42:42 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:32.812 "subsystems": [ 00:35:32.812 { 00:35:32.812 "subsystem": "bdev", 00:35:32.812 "config": [ 00:35:32.812 { 00:35:32.812 "method": "bdev_nvme_attach_controller", 00:35:32.812 "params": { 00:35:32.812 "trtype": "tcp", 00:35:32.812 "adrfam": "IPv4", 00:35:32.812 "name": "Nvme0", 00:35:32.812 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:32.812 "traddr": "10.0.0.2", 00:35:32.812 "trsvcid": "4420" 00:35:32.812 } 00:35:32.812 }, 00:35:32.812 { 00:35:32.812 "method": "bdev_set_options", 00:35:32.812 "params": { 00:35:32.812 "bdev_auto_examine": false 00:35:32.812 } 00:35:32.812 } 00:35:32.812 ] 00:35:32.812 } 00:35:32.812 ] 
00:35:32.812 }' 00:35:32.812 22:42:42 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.pnjBf4SKww --ib Nvme0n1 --bs 4096 --count 16 00:35:32.812 [2024-07-12 22:42:42.984147] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:35:32.812 [2024-07-12 22:42:42.984216] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3633387 ] 00:35:32.812 [2024-07-12 22:42:43.115455] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:33.070 [2024-07-12 22:42:43.214386] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:33.588  Copying: 64/64 [kB] (average 1422 kBps) 00:35:33.588 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:33.588 22:42:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:33.588 22:42:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.588 22:42:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:33.588 22:42:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:33.588 22:42:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.588 22:42:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:33.588 22:42:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:33.588 22:42:43 chaining -- 
bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:33.588 22:42:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.588 22:42:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:33.588 22:42:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:33.588 22:42:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.588 22:42:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.JZr4t0zxW1 /tmp/tmp.pnjBf4SKww 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.JZr4t0zxW1 /tmp/tmp.pnjBf4SKww 00:35:33.588 22:42:43 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:35:33.588 22:42:43 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:35:33.588 22:42:43 chaining -- nvmf/common.sh@117 -- # sync 00:35:33.588 22:42:43 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:35:33.588 22:42:43 chaining -- nvmf/common.sh@120 -- # set +e 00:35:33.588 22:42:43 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:35:33.588 22:42:43 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:35:33.588 rmmod nvme_tcp 00:35:33.847 rmmod nvme_fabrics 00:35:33.847 rmmod nvme_keyring 00:35:33.847 22:42:43 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:35:33.847 22:42:43 chaining -- nvmf/common.sh@124 -- # set -e 00:35:33.847 22:42:43 chaining -- nvmf/common.sh@125 -- # return 0 00:35:33.847 22:42:43 chaining -- nvmf/common.sh@489 -- # '[' -n 3632519 ']' 00:35:33.847 22:42:43 chaining -- nvmf/common.sh@490 -- # killprocess 3632519 00:35:33.847 22:42:43 chaining -- common/autotest_common.sh@948 -- # '[' -z 3632519 ']' 00:35:33.847 22:42:43 chaining -- common/autotest_common.sh@952 -- # kill -0 3632519 00:35:33.847 22:42:43 chaining -- common/autotest_common.sh@953 -- # uname 00:35:33.847 22:42:43 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:33.847 22:42:43 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3632519 00:35:33.847 22:42:44 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:33.847 22:42:44 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:33.847 22:42:44 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3632519' 00:35:33.847 killing process with pid 3632519 00:35:33.847 22:42:44 chaining -- common/autotest_common.sh@967 -- # kill 3632519 00:35:33.847 22:42:44 
chaining -- common/autotest_common.sh@972 -- # wait 3632519 00:35:34.105 22:42:44 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:35:34.106 22:42:44 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:35:34.106 22:42:44 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:35:34.106 22:42:44 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:35:34.106 22:42:44 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:35:34.106 22:42:44 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:34.106 22:42:44 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:34.106 22:42:44 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:34.106 22:42:44 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:35:34.106 22:42:44 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:35:34.106 22:42:44 chaining -- bdev/chaining.sh@132 -- # bperfpid=3633597 00:35:34.106 22:42:44 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:34.106 22:42:44 chaining -- bdev/chaining.sh@134 -- # waitforlisten 3633597 00:35:34.106 22:42:44 chaining -- common/autotest_common.sh@829 -- # '[' -z 3633597 ']' 00:35:34.106 22:42:44 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:34.106 22:42:44 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:34.106 22:42:44 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:34.106 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:34.106 22:42:44 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:34.106 22:42:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:34.106 [2024-07-12 22:42:44.404578] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
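The decrypt pass traced above assembles its spdk_dd configuration on the fly: gen_nvme.sh emits a JSON config that attaches the remote NVMe-oF controller (Nvme0 at 10.0.0.2:4420), jq appends a bdev_set_options entry that disables auto-examine, and the result is handed to spdk_dd on an anonymous file descriptor. A condensed sketch of that pattern, assuming the config is fed through process substitution (inferred from the "-c /dev/fd/62" argument in the trace):

#!/usr/bin/env bash
# Sketch of the config construction traced at chaining.sh@25-33 above. Paths and
# temp-file names are the ones shown in the log; feeding the config via <(...)
# is an assumption inferred from the "-c /dev/fd/62" seen in the trace.
rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk

# Describe the remote NVMe-oF controller (bdev Nvme0n1 once attached) and append
# a bdev_set_options entry so spdk_dd does not auto-examine the bdev.
config=$("$rootdir"/scripts/gen_nvme.sh --mode=remote --json-with-subsystems \
                --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 \
        | jq '.subsystems[0].config[.subsystems[0].config | length] |=
              {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}')

# Read 16 x 4096-byte blocks back through the crypto chain into a temp file.
"$rootdir"/build/bin/spdk_dd -c <(echo "$config") \
        --of /tmp/tmp.pnjBf4SKww --ib Nvme0n1 --bs 4096 --count 16

The read-back file is then compared byte-for-byte against the plaintext written in the encrypt pass (cmp /tmp/tmp.JZr4t0zxW1 /tmp/tmp.pnjBf4SKww in the trace) before the target is torn down.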
00:35:34.106 [2024-07-12 22:42:44.404648] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3633597 ] 00:35:34.364 [2024-07-12 22:42:44.579472] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:34.364 [2024-07-12 22:42:44.684110] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:35.298 22:42:45 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:35.298 22:42:45 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:35.298 22:42:45 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:35:35.298 22:42:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:35.298 22:42:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:35.298 malloc0 00:35:35.298 true 00:35:35.298 true 00:35:35.298 [2024-07-12 22:42:45.432255] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:35.298 crypto0 00:35:35.298 [2024-07-12 22:42:45.440279] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:35.298 crypto1 00:35:35.298 22:42:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:35.298 22:42:45 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:35.298 Running I/O for 5 seconds... 00:35:40.593 00:35:40.593 Latency(us) 00:35:40.593 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:40.593 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:40.593 Verification LBA range: start 0x0 length 0x2000 00:35:40.593 crypto1 : 5.01 11434.96 44.67 0.00 0.00 22326.27 6382.64 14531.90 00:35:40.593 =================================================================================================================== 00:35:40.593 Total : 11434.96 44.67 0.00 0.00 22326.27 6382.64 14531.90 00:35:40.593 0 00:35:40.593 22:42:50 chaining -- bdev/chaining.sh@146 -- # killprocess 3633597 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@948 -- # '[' -z 3633597 ']' 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@952 -- # kill -0 3633597 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@953 -- # uname 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3633597 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3633597' 00:35:40.593 killing process with pid 3633597 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@967 -- # kill 3633597 00:35:40.593 Received shutdown signal, test time was about 5.000000 seconds 00:35:40.593 00:35:40.593 Latency(us) 00:35:40.593 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:40.593 =================================================================================================================== 00:35:40.593 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@972 -- # wait 3633597 00:35:40.593 22:42:50 chaining -- bdev/chaining.sh@152 -- # 
bperfpid=3634472 00:35:40.593 22:42:50 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:40.593 22:42:50 chaining -- bdev/chaining.sh@154 -- # waitforlisten 3634472 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@829 -- # '[' -z 3634472 ']' 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:40.593 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:40.593 22:42:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:40.852 [2024-07-12 22:42:50.947514] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:35:40.852 [2024-07-12 22:42:50.947584] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3634472 ] 00:35:40.852 [2024-07-12 22:42:51.069233] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:40.852 [2024-07-12 22:42:51.175055] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:41.789 22:42:51 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:41.789 22:42:51 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:41.789 22:42:51 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:35:41.789 22:42:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:41.789 22:42:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:41.789 malloc0 00:35:41.789 true 00:35:41.789 true 00:35:41.789 [2024-07-12 22:42:51.937881] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:35:41.789 [2024-07-12 22:42:51.937936] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:41.789 [2024-07-12 22:42:51.937957] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dd6730 00:35:41.789 [2024-07-12 22:42:51.937970] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:41.789 [2024-07-12 22:42:51.939040] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:41.789 [2024-07-12 22:42:51.939064] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:35:41.789 pt0 00:35:41.789 [2024-07-12 22:42:51.945910] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:41.789 crypto0 00:35:41.789 [2024-07-12 22:42:51.953938] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:41.789 crypto1 00:35:41.789 22:42:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:41.789 22:42:51 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:41.789 Running I/O for 5 seconds... 
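The bdevperf-based runs above all follow the same pattern: bdevperf is started idle with --wait-for-rpc -z, the crypto/passthru bdev stack is configured over its RPC socket, and the 5-second verify workload is only then started with perform_tests. A minimal sketch of that flow, using the helpers named in the trace:

#!/usr/bin/env bash
# Condensed sketch of the bdevperf run pattern traced above (chaining.sh@131-146
# and @151-167). waitforlisten/killprocess are the autotest_common.sh helpers the
# trace calls; the bdev-creation RPCs are left as a placeholder because the
# individual rpc.py invocations are not visible in the log.
rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk

# Start bdevperf idle: --wait-for-rpc defers init, -z waits for perform_tests.
"$rootdir"/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
bperfpid=$!
waitforlisten "$bperfpid"        # block until the RPC socket answers

# ... create malloc0 / pt0 / crypto0 / crypto1 over the RPC socket here ...

# Kick off the 5-second verify workload, then tear the process down.
"$rootdir"/examples/bdev/bdevperf/bdevperf.py perform_tests
killprocess "$bperfpid"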
00:35:47.060 00:35:47.060 Latency(us) 00:35:47.060 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:47.060 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:47.060 Verification LBA range: start 0x0 length 0x2000 00:35:47.060 crypto1 : 5.02 8924.36 34.86 0.00 0.00 28607.49 6496.61 17210.32 00:35:47.060 =================================================================================================================== 00:35:47.060 Total : 8924.36 34.86 0.00 0.00 28607.49 6496.61 17210.32 00:35:47.060 0 00:35:47.060 22:42:57 chaining -- bdev/chaining.sh@167 -- # killprocess 3634472 00:35:47.060 22:42:57 chaining -- common/autotest_common.sh@948 -- # '[' -z 3634472 ']' 00:35:47.060 22:42:57 chaining -- common/autotest_common.sh@952 -- # kill -0 3634472 00:35:47.060 22:42:57 chaining -- common/autotest_common.sh@953 -- # uname 00:35:47.060 22:42:57 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:47.060 22:42:57 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3634472 00:35:47.060 22:42:57 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:47.060 22:42:57 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:47.060 22:42:57 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3634472' 00:35:47.060 killing process with pid 3634472 00:35:47.060 22:42:57 chaining -- common/autotest_common.sh@967 -- # kill 3634472 00:35:47.060 Received shutdown signal, test time was about 5.000000 seconds 00:35:47.060 00:35:47.060 Latency(us) 00:35:47.060 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:47.060 =================================================================================================================== 00:35:47.060 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:47.060 22:42:57 chaining -- common/autotest_common.sh@972 -- # wait 3634472 00:35:47.060 22:42:57 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:35:47.060 22:42:57 chaining -- bdev/chaining.sh@170 -- # killprocess 3634472 00:35:47.060 22:42:57 chaining -- common/autotest_common.sh@948 -- # '[' -z 3634472 ']' 00:35:47.060 22:42:57 chaining -- common/autotest_common.sh@952 -- # kill -0 3634472 00:35:47.060 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (3634472) - No such process 00:35:47.061 22:42:57 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 3634472 is not found' 00:35:47.061 Process with pid 3634472 is not found 00:35:47.061 22:42:57 chaining -- bdev/chaining.sh@171 -- # wait 3634472 00:35:47.061 22:42:57 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:47.061 22:42:57 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:47.061 22:42:57 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:47.061 22:42:57 chaining 
-- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:47.061 22:42:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@336 -- # return 1 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:35:47.061 WARNING: No supported devices were found, fallback requested for tcp test 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:35:47.061 22:42:57 chaining -- 
nvmf/common.sh@432 -- # nvmf_veth_init 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:35:47.061 22:42:57 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:35:47.321 Cannot find device "nvmf_tgt_br" 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@155 -- # true 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:35:47.321 Cannot find device "nvmf_tgt_br2" 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@156 -- # true 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:35:47.321 Cannot find device "nvmf_tgt_br" 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@158 -- # true 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:35:47.321 Cannot find device "nvmf_tgt_br2" 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@159 -- # true 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:35:47.321 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@162 -- # true 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:35:47.321 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@163 -- # true 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:35:47.321 22:42:57 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@178 -- # ip addr 
add 10.0.0.1/24 dev nvmf_init_if 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:35:47.582 22:42:57 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:35:47.842 22:42:57 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:35:47.842 22:42:58 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:35:47.842 22:42:58 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:35:47.842 22:42:58 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:35:47.842 22:42:58 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:35:47.842 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:47.842 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.114 ms 00:35:47.842 00:35:47.842 --- 10.0.0.2 ping statistics --- 00:35:47.842 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:47.842 rtt min/avg/max/mdev = 0.114/0.114/0.114/0.000 ms 00:35:47.842 22:42:58 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:35:47.842 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:35:47.842 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.084 ms 00:35:47.842 00:35:47.842 --- 10.0.0.3 ping statistics --- 00:35:47.842 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:47.842 rtt min/avg/max/mdev = 0.084/0.084/0.084/0.000 ms 00:35:47.842 22:42:58 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:35:48.102 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:35:48.102 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.040 ms 00:35:48.102 00:35:48.102 --- 10.0.0.1 ping statistics --- 00:35:48.102 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:48.102 rtt min/avg/max/mdev = 0.040/0.040/0.040/0.000 ms 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@433 -- # return 0 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:48.102 22:42:58 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:48.102 22:42:58 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:48.102 22:42:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@481 -- # nvmfpid=3635610 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:35:48.102 22:42:58 chaining -- nvmf/common.sh@482 -- # waitforlisten 3635610 00:35:48.102 22:42:58 chaining -- common/autotest_common.sh@829 -- # '[' -z 3635610 ']' 00:35:48.102 22:42:58 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:48.102 22:42:58 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:48.102 22:42:58 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:48.102 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:48.102 22:42:58 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:48.102 22:42:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:48.102 [2024-07-12 22:42:58.297515] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:35:48.102 [2024-07-12 22:42:58.297586] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:48.102 [2024-07-12 22:42:58.423804] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:48.361 [2024-07-12 22:42:58.525083] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:48.361 [2024-07-12 22:42:58.525126] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:48.361 [2024-07-12 22:42:58.525141] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:48.361 [2024-07-12 22:42:58.525154] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 
00:35:48.361 [2024-07-12 22:42:58.525165] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:35:48.361 [2024-07-12 22:42:58.525191] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:48.929 22:42:59 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:48.929 22:42:59 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:48.929 22:42:59 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:48.929 22:42:59 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:48.929 22:42:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:49.189 22:42:59 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:49.189 22:42:59 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:35:49.189 22:42:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:49.189 22:42:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:49.189 malloc0 00:35:49.189 [2024-07-12 22:42:59.288185] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:49.189 [2024-07-12 22:42:59.304400] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:49.189 22:42:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:49.189 22:42:59 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:35:49.189 22:42:59 chaining -- bdev/chaining.sh@189 -- # bperfpid=3635802 00:35:49.189 22:42:59 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:49.189 22:42:59 chaining -- bdev/chaining.sh@191 -- # waitforlisten 3635802 /var/tmp/bperf.sock 00:35:49.189 22:42:59 chaining -- common/autotest_common.sh@829 -- # '[' -z 3635802 ']' 00:35:49.189 22:42:59 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:49.189 22:42:59 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:49.189 22:42:59 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:49.189 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:49.189 22:42:59 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:49.189 22:42:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:49.189 [2024-07-12 22:42:59.377121] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 
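Because no supported NIC was found, nvmftestinit falls back to the veth topology that nvmf_veth_init builds earlier in the trace: a network namespace nvmf_tgt_ns_spdk holding the target-side interfaces (10.0.0.2 and 10.0.0.3), the initiator-side nvmf_init_if (10.0.0.1) left in the root namespace, and everything bridged through nvmf_br with an iptables rule admitting TCP port 4420. Condensed from the ip/iptables commands in the log (error handling and the leftover-cleanup steps are omitted):

#!/usr/bin/env bash
# Condensed from the nvmf_veth_init commands (nvmf/common.sh@141-207) traced above.
ip netns add nvmf_tgt_ns_spdk

ip link add nvmf_init_if type veth peer name nvmf_init_br
ip link add nvmf_tgt_if  type veth peer name nvmf_tgt_br
ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2

ip link set nvmf_tgt_if  netns nvmf_tgt_ns_spdk
ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk

ip addr add 10.0.0.1/24 dev nvmf_init_if                                  # initiator
ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if   # target
ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2  # target, 2nd IP

ip link set nvmf_init_if up; ip link set nvmf_init_br up
ip link set nvmf_tgt_br up;  ip link set nvmf_tgt_br2 up
ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up
ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up
ip netns exec nvmf_tgt_ns_spdk ip link set lo up

ip link add nvmf_br type bridge && ip link set nvmf_br up
ip link set nvmf_init_br master nvmf_br
ip link set nvmf_tgt_br  master nvmf_br
ip link set nvmf_tgt_br2 master nvmf_br

iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT
iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT

The three pings that follow in the log simply confirm that 10.0.0.2 and 10.0.0.3 are reachable from the root namespace and 10.0.0.1 from inside the namespace before nvmf_tgt is launched under "ip netns exec nvmf_tgt_ns_spdk".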
00:35:49.189 [2024-07-12 22:42:59.377190] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3635802 ] 00:35:49.189 [2024-07-12 22:42:59.494703] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:49.449 [2024-07-12 22:42:59.591526] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:50.018 22:43:00 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:50.018 22:43:00 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:50.018 22:43:00 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:35:50.018 22:43:00 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:50.587 [2024-07-12 22:43:00.658091] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:50.587 nvme0n1 00:35:50.587 true 00:35:50.587 crypto0 00:35:50.587 22:43:00 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:50.587 Running I/O for 5 seconds... 00:35:55.862 00:35:55.862 Latency(us) 00:35:55.862 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:55.862 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:55.862 Verification LBA range: start 0x0 length 0x2000 00:35:55.862 crypto0 : 5.02 8213.95 32.09 0.00 0.00 31064.85 3020.35 24618.74 00:35:55.862 =================================================================================================================== 00:35:55.862 Total : 8213.95 32.09 0.00 0.00 31064.85 3020.35 24618.74 00:35:55.862 0 00:35:55.862 22:43:05 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:35:55.862 22:43:05 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:35:55.862 22:43:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:55.862 22:43:05 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:55.862 22:43:05 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:55.862 22:43:05 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:55.862 22:43:05 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:55.862 22:43:05 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:35:55.862 22:43:05 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:55.862 22:43:05 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:56.131 22:43:06 chaining -- bdev/chaining.sh@205 -- # sequence=82434 00:35:56.131 22:43:06 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:35:56.131 22:43:06 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:35:56.131 22:43:06 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:56.131 22:43:06 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:56.131 22:43:06 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:56.131 22:43:06 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:56.131 22:43:06 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:56.131 22:43:06 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:56.131 22:43:06 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | 
select(.opcode == "encrypt").executed' 00:35:56.131 22:43:06 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:56.390 22:43:06 chaining -- bdev/chaining.sh@206 -- # encrypt=41217 00:35:56.390 22:43:06 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:35:56.390 22:43:06 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:35:56.390 22:43:06 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:56.390 22:43:06 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:56.390 22:43:06 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:56.390 22:43:06 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:56.390 22:43:06 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:56.390 22:43:06 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:56.390 22:43:06 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:56.390 22:43:06 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:56.649 22:43:06 chaining -- bdev/chaining.sh@207 -- # decrypt=41217 00:35:56.649 22:43:06 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:35:56.649 22:43:06 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:35:56.649 22:43:06 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:56.649 22:43:06 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:56.649 22:43:06 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:35:56.649 22:43:06 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:56.649 22:43:06 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:35:56.649 22:43:06 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:35:56.649 22:43:06 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:56.649 22:43:06 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:56.909 22:43:06 chaining -- bdev/chaining.sh@208 -- # crc32c=82434 00:35:56.909 22:43:06 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:35:56.909 22:43:06 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:35:56.909 22:43:06 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:35:56.909 22:43:06 chaining -- bdev/chaining.sh@214 -- # killprocess 3635802 00:35:56.909 22:43:06 chaining -- common/autotest_common.sh@948 -- # '[' -z 3635802 ']' 00:35:56.909 22:43:06 chaining -- common/autotest_common.sh@952 -- # kill -0 3635802 00:35:56.909 22:43:06 chaining -- common/autotest_common.sh@953 -- # uname 00:35:56.909 22:43:06 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:56.909 22:43:06 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3635802 00:35:56.909 22:43:07 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:56.909 22:43:07 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:56.909 22:43:07 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3635802' 00:35:56.909 killing process with pid 3635802 00:35:56.909 22:43:07 chaining -- common/autotest_common.sh@967 -- # kill 3635802 00:35:56.909 Received shutdown signal, test time was about 
5.000000 seconds 00:35:56.909 00:35:56.909 Latency(us) 00:35:56.909 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:56.909 =================================================================================================================== 00:35:56.909 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:56.909 22:43:07 chaining -- common/autotest_common.sh@972 -- # wait 3635802 00:35:57.168 22:43:07 chaining -- bdev/chaining.sh@219 -- # bperfpid=3636768 00:35:57.168 22:43:07 chaining -- bdev/chaining.sh@221 -- # waitforlisten 3636768 /var/tmp/bperf.sock 00:35:57.168 22:43:07 chaining -- common/autotest_common.sh@829 -- # '[' -z 3636768 ']' 00:35:57.168 22:43:07 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:57.169 22:43:07 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:57.169 22:43:07 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:57.169 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:57.169 22:43:07 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:57.169 22:43:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:57.169 22:43:07 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:35:57.169 [2024-07-12 22:43:07.319696] Starting SPDK v24.09-pre git sha1 9b8dc23b2 / DPDK 24.03.0 initialization... 00:35:57.169 [2024-07-12 22:43:07.319771] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3636768 ] 00:35:57.169 [2024-07-12 22:43:07.450181] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:57.427 [2024-07-12 22:43:07.546690] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:57.997 22:43:08 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:57.997 22:43:08 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:57.997 22:43:08 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:35:57.997 22:43:08 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:58.601 [2024-07-12 22:43:08.644961] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:58.601 nvme0n1 00:35:58.601 true 00:35:58.601 crypto0 00:35:58.601 22:43:08 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:58.601 Running I/O for 5 seconds... 
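For the bdevperf runs, the accel statistics are read from the bdevperf process itself over /var/tmp/bperf.sock: rpc_bperf wraps rpc.py with that socket, and get_stat_bperf reuses get_stat with rpc_bperf as the RPC backend, exactly as the get_stat_bperf calls traced above show. A reconstruction of those helpers from the traced chaining.sh lines 22 and 37-48 (treat the exact function bodies as an approximation of the upstream script):

# Reconstructed from the chaining.sh@22 and @37-48 trace lines above.
rpc_bperf() {
        /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
                -s /var/tmp/bperf.sock "$@"
}

get_stat() {
        local event=$1 opcode=${2:-} rpc=${3:-rpc_cmd}
        if [[ -z $opcode ]]; then
                # top-level counter, e.g. .sequence_executed
                $rpc accel_get_stats | jq -r ".$event"
        else
                # per-opcode counter, e.g. executed count for "encrypt"
                $rpc accel_get_stats \
                        | jq -r ".operations[] | select(.opcode == \"$opcode\").executed"
        fi
}

get_stat_bperf() {
        get_stat "$1" "${2:-}" rpc_bperf
}

sequence=$(get_stat_bperf sequence_executed)
encrypt=$(get_stat_bperf executed encrypt)
decrypt=$(get_stat_bperf executed decrypt)
crc32c=$(get_stat_bperf executed crc32c)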
00:36:03.872 00:36:03.872 Latency(us) 00:36:03.872 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:03.872 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:36:03.872 Verification LBA range: start 0x0 length 0x200 00:36:03.872 crypto0 : 5.01 1686.94 105.43 0.00 0.00 18592.39 676.73 18919.96 00:36:03.872 =================================================================================================================== 00:36:03.872 Total : 1686.94 105.43 0.00 0.00 18592.39 676.73 18919.96 00:36:03.872 0 00:36:03.872 22:43:13 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:36:03.872 22:43:13 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:36:03.872 22:43:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:03.872 22:43:13 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:03.872 22:43:13 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:03.872 22:43:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:03.872 22:43:13 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:03.872 22:43:13 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:36:03.872 22:43:13 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:03.872 22:43:13 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:03.872 22:43:14 chaining -- bdev/chaining.sh@233 -- # sequence=16890 00:36:03.872 22:43:14 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:36:03.872 22:43:14 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:36:03.872 22:43:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:03.872 22:43:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:03.872 22:43:14 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:03.872 22:43:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:03.872 22:43:14 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:03.872 22:43:14 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:03.872 22:43:14 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:03.872 22:43:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:04.131 22:43:14 chaining -- bdev/chaining.sh@234 -- # encrypt=8445 00:36:04.131 22:43:14 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:36:04.131 22:43:14 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:36:04.131 22:43:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:04.131 22:43:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:04.131 22:43:14 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:04.131 22:43:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:04.131 22:43:14 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:04.131 22:43:14 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:04.131 22:43:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:04.131 22:43:14 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:04.388 22:43:14 chaining -- bdev/chaining.sh@235 -- # decrypt=8445 
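After each bdevperf run the script pulls the four counters and checks the chain's internal consistency: at least one accel sequence must have completed, and every executed sequence must account for exactly one encrypt-or-decrypt operation and one crc32c operation (for the 64 KiB run above: 8445 + 8445 == 16890 on both counts). The checks as traced at chaining.sh@210-212 and @238-240, written out as plain arithmetic tests; under the suite's errexit handling a false test aborts the run:

# sequence/encrypt/decrypt/crc32c hold the values extracted with get_stat_bperf.
(( sequence > 0 ))                      # at least one accel sequence completed
(( encrypt + decrypt == sequence ))     # each sequence did exactly one cipher op
(( encrypt + decrypt == crc32c ))       # ... and exactly one crc32c op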
00:36:04.388 22:43:14 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:36:04.388 22:43:14 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:36:04.388 22:43:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:04.388 22:43:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:04.388 22:43:14 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:36:04.388 22:43:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:04.388 22:43:14 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:36:04.388 22:43:14 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:04.388 22:43:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:36:04.388 22:43:14 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:04.646 22:43:14 chaining -- bdev/chaining.sh@236 -- # crc32c=16890 00:36:04.646 22:43:14 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:36:04.646 22:43:14 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:36:04.646 22:43:14 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:36:04.646 22:43:14 chaining -- bdev/chaining.sh@242 -- # killprocess 3636768 00:36:04.646 22:43:14 chaining -- common/autotest_common.sh@948 -- # '[' -z 3636768 ']' 00:36:04.646 22:43:14 chaining -- common/autotest_common.sh@952 -- # kill -0 3636768 00:36:04.646 22:43:14 chaining -- common/autotest_common.sh@953 -- # uname 00:36:04.646 22:43:14 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:04.646 22:43:14 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3636768 00:36:04.646 22:43:14 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:04.646 22:43:14 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:04.646 22:43:14 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3636768' 00:36:04.646 killing process with pid 3636768 00:36:04.646 22:43:14 chaining -- common/autotest_common.sh@967 -- # kill 3636768 00:36:04.646 Received shutdown signal, test time was about 5.000000 seconds 00:36:04.646 00:36:04.646 Latency(us) 00:36:04.646 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:04.646 =================================================================================================================== 00:36:04.646 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:04.646 22:43:14 chaining -- common/autotest_common.sh@972 -- # wait 3636768 00:36:04.904 22:43:15 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:36:04.904 22:43:15 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:36:04.904 22:43:15 chaining -- nvmf/common.sh@117 -- # sync 00:36:04.904 22:43:15 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:36:04.904 22:43:15 chaining -- nvmf/common.sh@120 -- # set +e 00:36:04.904 22:43:15 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:36:04.904 22:43:15 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:36:04.904 rmmod nvme_tcp 00:36:04.904 rmmod nvme_fabrics 00:36:04.904 rmmod nvme_keyring 00:36:04.904 22:43:15 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:36:04.904 22:43:15 chaining -- nvmf/common.sh@124 -- # set -e 00:36:04.904 22:43:15 chaining -- nvmf/common.sh@125 -- # return 0 00:36:04.904 22:43:15 chaining -- nvmf/common.sh@489 -- # 
'[' -n 3635610 ']' 00:36:04.904 22:43:15 chaining -- nvmf/common.sh@490 -- # killprocess 3635610 00:36:04.904 22:43:15 chaining -- common/autotest_common.sh@948 -- # '[' -z 3635610 ']' 00:36:04.904 22:43:15 chaining -- common/autotest_common.sh@952 -- # kill -0 3635610 00:36:04.904 22:43:15 chaining -- common/autotest_common.sh@953 -- # uname 00:36:04.904 22:43:15 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:04.904 22:43:15 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 3635610 00:36:04.904 22:43:15 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:36:04.904 22:43:15 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:36:04.904 22:43:15 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 3635610' 00:36:04.904 killing process with pid 3635610 00:36:04.904 22:43:15 chaining -- common/autotest_common.sh@967 -- # kill 3635610 00:36:04.904 22:43:15 chaining -- common/autotest_common.sh@972 -- # wait 3635610 00:36:05.161 22:43:15 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:36:05.161 22:43:15 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:36:05.161 22:43:15 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:36:05.161 22:43:15 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:36:05.161 22:43:15 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:36:05.161 22:43:15 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:05.161 22:43:15 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:05.161 22:43:15 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:05.161 22:43:15 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:36:05.161 22:43:15 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:36:05.161 00:36:05.161 real 0m45.555s 00:36:05.161 user 0m59.300s 00:36:05.161 sys 0m12.951s 00:36:05.161 22:43:15 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:05.161 22:43:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:05.161 ************************************ 00:36:05.161 END TEST chaining 00:36:05.161 ************************************ 00:36:05.419 22:43:15 -- common/autotest_common.sh@1142 -- # return 0 00:36:05.419 22:43:15 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:36:05.419 22:43:15 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:36:05.419 22:43:15 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:36:05.419 22:43:15 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:36:05.419 22:43:15 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:36:05.419 22:43:15 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:36:05.419 22:43:15 -- common/autotest_common.sh@722 -- # xtrace_disable 00:36:05.419 22:43:15 -- common/autotest_common.sh@10 -- # set +x 00:36:05.419 22:43:15 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:36:05.419 22:43:15 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:36:05.419 22:43:15 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:36:05.419 22:43:15 -- common/autotest_common.sh@10 -- # set +x 00:36:09.607 INFO: APP EXITING 00:36:09.607 INFO: killing all VMs 00:36:09.607 INFO: killing vhost app 00:36:09.607 INFO: EXIT DONE 00:36:12.895 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:36:12.895 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:36:12.895 Waiting for 
block devices as requested 00:36:12.895 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:36:13.154 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:13.154 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:13.154 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:13.413 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:13.413 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:13.413 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:13.672 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:13.672 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:13.672 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:13.931 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:13.931 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:13.931 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:14.190 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:14.190 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:14.190 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:14.449 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:18.639 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:36:18.639 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:36:18.639 Cleaning 00:36:18.639 Removing: /var/run/dpdk/spdk0/config 00:36:18.639 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:36:18.639 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:36:18.639 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:36:18.639 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:36:18.639 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:36:18.639 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:36:18.639 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:36:18.639 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:36:18.639 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:36:18.639 Removing: /var/run/dpdk/spdk0/hugepage_info 00:36:18.639 Removing: /dev/shm/nvmf_trace.0 00:36:18.639 Removing: /dev/shm/spdk_tgt_trace.pid3381010 00:36:18.639 Removing: /var/run/dpdk/spdk0 00:36:18.639 Removing: /var/run/dpdk/spdk_pid3380141 00:36:18.639 Removing: /var/run/dpdk/spdk_pid3381010 00:36:18.639 Removing: /var/run/dpdk/spdk_pid3381545 00:36:18.639 Removing: /var/run/dpdk/spdk_pid3382274 00:36:18.639 Removing: /var/run/dpdk/spdk_pid3382468 00:36:18.639 Removing: /var/run/dpdk/spdk_pid3383296 00:36:18.639 Removing: /var/run/dpdk/spdk_pid3383398 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3383686 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3386578 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3388147 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3388371 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3388607 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3389019 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3389260 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3389459 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3389661 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3389885 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3390636 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3393330 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3393523 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3393770 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3393987 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3394174 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3394270 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3394578 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3394798 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3394997 00:36:18.640 Removing: 
/var/run/dpdk/spdk_pid3395193 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3395396 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3395660 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3395946 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3396142 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3396345 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3396536 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3396738 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3397030 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3397294 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3397491 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3397687 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3397891 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3398148 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3398464 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3398663 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3398859 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3399087 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3399426 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3399804 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3400098 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3400382 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3400751 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3401118 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3401326 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3401555 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3401814 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3402283 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3402653 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3402844 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3406858 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3408526 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3410217 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3411111 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3412180 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3412603 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3412652 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3412709 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3416884 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3417441 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3418332 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3418653 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3423997 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3425486 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3426460 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3430617 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3432223 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3433151 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3437219 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3439748 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3441030 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3450556 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3452604 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3453582 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3463179 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3465370 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3466354 00:36:18.640 Removing: /var/run/dpdk/spdk_pid3476411 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3479846 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3480826 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3491582 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3494023 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3495434 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3506608 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3509035 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3510189 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3521020 00:36:18.899 Removing: 
/var/run/dpdk/spdk_pid3525288 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3526678 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3527713 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3530976 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3536066 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3538508 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3542993 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3546395 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3552110 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3554814 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3560950 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3563365 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3569329 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3571748 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3578268 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3580636 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3584793 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3585148 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3585504 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3585858 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3586290 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3587062 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3587899 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3588243 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3590114 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3591894 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3593493 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3594795 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3596436 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3598114 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3599832 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3601561 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3602102 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3602476 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3604481 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3606325 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3608174 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3609232 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3610462 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3611004 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3611030 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3611256 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3611464 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3611655 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3612892 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3614402 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3615900 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3616612 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3617405 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3617696 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3617723 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3617845 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3618725 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3619326 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3619759 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3621770 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3623630 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3625976 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3627034 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3628267 00:36:18.899 Removing: /var/run/dpdk/spdk_pid3628807 00:36:18.900 Removing: /var/run/dpdk/spdk_pid3628835 00:36:18.900 Removing: /var/run/dpdk/spdk_pid3632737 00:36:18.900 Removing: /var/run/dpdk/spdk_pid3632947 00:36:18.900 Removing: /var/run/dpdk/spdk_pid3633027 00:36:18.900 Removing: /var/run/dpdk/spdk_pid3633177 00:36:18.900 Removing: /var/run/dpdk/spdk_pid3633387 00:36:18.900 Removing: 
/var/run/dpdk/spdk_pid3633597 00:36:18.900 Removing: /var/run/dpdk/spdk_pid3634472 00:36:18.900 Removing: /var/run/dpdk/spdk_pid3635802 00:36:18.900 Removing: /var/run/dpdk/spdk_pid3636768 00:36:18.900 Clean 00:36:19.158 22:43:29 -- common/autotest_common.sh@1451 -- # return 0 00:36:19.159 22:43:29 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:36:19.159 22:43:29 -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:19.159 22:43:29 -- common/autotest_common.sh@10 -- # set +x 00:36:19.159 22:43:29 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:36:19.159 22:43:29 -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:19.159 22:43:29 -- common/autotest_common.sh@10 -- # set +x 00:36:19.159 22:43:29 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:36:19.159 22:43:29 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:36:19.159 22:43:29 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:36:19.159 22:43:29 -- spdk/autotest.sh@391 -- # hash lcov 00:36:19.159 22:43:29 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:36:19.159 22:43:29 -- spdk/autotest.sh@393 -- # hostname 00:36:19.159 22:43:29 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:36:19.417 geninfo: WARNING: invalid characters removed from testname! 00:36:46.031 22:43:56 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:50.228 22:43:59 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:52.772 22:44:02 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:55.304 22:44:05 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:57.839 22:44:07 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc 
lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:00.377 22:44:10 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:02.915 22:44:13 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:37:02.915 22:44:13 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:37:02.915 22:44:13 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:37:02.915 22:44:13 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:37:02.915 22:44:13 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:37:02.915 22:44:13 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:02.915 22:44:13 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:02.915 22:44:13 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:02.915 22:44:13 -- paths/export.sh@5 -- $ export PATH 00:37:02.915 22:44:13 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:02.915 22:44:13 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:02.915 22:44:13 -- common/autobuild_common.sh@444 -- $ date +%s 00:37:02.916 22:44:13 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1720817053.XXXXXX 00:37:02.916 22:44:13 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1720817053.w1IpLr 00:37:02.916 22:44:13 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:37:02.916 22:44:13 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 
00:37:02.916 22:44:13 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:37:02.916 22:44:13 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:37:02.916 22:44:13 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:37:02.916 22:44:13 -- common/autobuild_common.sh@460 -- $ get_config_params 00:37:02.916 22:44:13 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:37:02.916 22:44:13 -- common/autotest_common.sh@10 -- $ set +x 00:37:03.174 22:44:13 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:37:03.174 22:44:13 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:37:03.174 22:44:13 -- pm/common@17 -- $ local monitor 00:37:03.174 22:44:13 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:03.174 22:44:13 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:03.174 22:44:13 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:03.174 22:44:13 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:03.174 22:44:13 -- pm/common@25 -- $ sleep 1 00:37:03.174 22:44:13 -- pm/common@21 -- $ date +%s 00:37:03.174 22:44:13 -- pm/common@21 -- $ date +%s 00:37:03.174 22:44:13 -- pm/common@21 -- $ date +%s 00:37:03.174 22:44:13 -- pm/common@21 -- $ date +%s 00:37:03.175 22:44:13 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720817053 00:37:03.175 22:44:13 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720817053 00:37:03.175 22:44:13 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720817053 00:37:03.175 22:44:13 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720817053 00:37:03.175 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720817053_collect-vmstat.pm.log 00:37:03.175 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720817053_collect-cpu-load.pm.log 00:37:03.175 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720817053_collect-cpu-temp.pm.log 00:37:03.175 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720817053_collect-bmc-pm.bmc.pm.log 00:37:04.112 22:44:14 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:37:04.112 22:44:14 -- 
spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72 00:37:04.112 22:44:14 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:04.112 22:44:14 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:37:04.112 22:44:14 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:37:04.112 22:44:14 -- spdk/autopackage.sh@19 -- $ timing_finish 00:37:04.112 22:44:14 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:37:04.112 22:44:14 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:37:04.112 22:44:14 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:37:04.112 22:44:14 -- spdk/autopackage.sh@20 -- $ exit 0 00:37:04.112 22:44:14 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:37:04.112 22:44:14 -- pm/common@29 -- $ signal_monitor_resources TERM 00:37:04.112 22:44:14 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:37:04.112 22:44:14 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:04.112 22:44:14 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:37:04.112 22:44:14 -- pm/common@44 -- $ pid=3647427 00:37:04.112 22:44:14 -- pm/common@50 -- $ kill -TERM 3647427 00:37:04.112 22:44:14 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:04.112 22:44:14 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:37:04.112 22:44:14 -- pm/common@44 -- $ pid=3647428 00:37:04.112 22:44:14 -- pm/common@50 -- $ kill -TERM 3647428 00:37:04.112 22:44:14 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:04.112 22:44:14 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:37:04.112 22:44:14 -- pm/common@44 -- $ pid=3647429 00:37:04.112 22:44:14 -- pm/common@50 -- $ kill -TERM 3647429 00:37:04.112 22:44:14 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:04.112 22:44:14 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:37:04.112 22:44:14 -- pm/common@44 -- $ pid=3647453 00:37:04.112 22:44:14 -- pm/common@50 -- $ sudo -E kill -TERM 3647453 00:37:04.112 + [[ -n 3266208 ]] 00:37:04.112 + sudo kill 3266208 00:37:04.122 [Pipeline] } 00:37:04.144 [Pipeline] // stage 00:37:04.149 [Pipeline] } 00:37:04.170 [Pipeline] // timeout 00:37:04.175 [Pipeline] } 00:37:04.196 [Pipeline] // catchError 00:37:04.201 [Pipeline] } 00:37:04.219 [Pipeline] // wrap 00:37:04.225 [Pipeline] } 00:37:04.240 [Pipeline] // catchError 00:37:04.250 [Pipeline] stage 00:37:04.252 [Pipeline] { (Epilogue) 00:37:04.266 [Pipeline] catchError 00:37:04.268 [Pipeline] { 00:37:04.281 [Pipeline] echo 00:37:04.282 Cleanup processes 00:37:04.288 [Pipeline] sh 00:37:04.570 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:04.570 3647541 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:37:04.570 3647749 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:04.587 [Pipeline] sh 00:37:04.872 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:04.872 ++ grep -v 'sudo pgrep' 00:37:04.872 ++ awk '{print $1}' 00:37:04.872 + sudo kill -9 3647541 00:37:04.885 [Pipeline] sh 00:37:05.167 + 
jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:37:17.455 [Pipeline] sh 00:37:17.740 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:37:17.740 Artifacts sizes are good 00:37:17.756 [Pipeline] archiveArtifacts 00:37:17.763 Archiving artifacts 00:37:17.900 [Pipeline] sh 00:37:18.186 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:37:18.202 [Pipeline] cleanWs 00:37:18.213 [WS-CLEANUP] Deleting project workspace... 00:37:18.213 [WS-CLEANUP] Deferred wipeout is used... 00:37:18.220 [WS-CLEANUP] done 00:37:18.221 [Pipeline] } 00:37:18.240 [Pipeline] // catchError 00:37:18.255 [Pipeline] sh 00:37:18.541 + logger -p user.info -t JENKINS-CI 00:37:18.550 [Pipeline] } 00:37:18.570 [Pipeline] // stage 00:37:18.576 [Pipeline] } 00:37:18.594 [Pipeline] // node 00:37:18.600 [Pipeline] End of Pipeline 00:37:18.633 Finished: SUCCESS
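
Note on the stat check traced near the end of the chaining test above: bdev/chaining.sh verifies that every completed accel sequence produced matching encrypt/decrypt and crc32c operation counts, reading the counters from the bperf RPC socket. The sketch below is reconstructed only from the commands visible in this trace (the rpc.py invocation, the jq filter, and the three arithmetic assertions); the wrapper bodies and the way the sequence/encrypt/decrypt counters are collected are illustrative assumptions, not the script's exact code.

    # Minimal sketch, assuming the paths and socket name shown in the trace.
    rpc_bperf() {
        /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
            -s /var/tmp/bperf.sock "$@"
    }

    # Pull the "executed" counter for one opcode out of accel_get_stats JSON,
    # using the same jq filter that appears in the trace.
    get_stat_bperf() {
        local opcode=$1
        rpc_bperf accel_get_stats \
            | jq -r ".operations[] | select(.opcode == \"${opcode}\").executed"
    }

    crc32c=$(get_stat_bperf crc32c)
    # sequence, encrypt and decrypt are gathered the same way from their own
    # opcodes earlier in the test; the assertions below mirror the trace:
    (( sequence > 0 ))
    (( encrypt + decrypt == sequence ))
    (( encrypt + decrypt == crc32c ))

If any of the three assertions fails under "set -e", the test exits non-zero before the killprocess/nvmftestfini cleanup seen above, which is why the run only reaches "END TEST chaining" when the chained counters line up.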